nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒04‒11
24 papers chosen by
Sune Karlsson
Örebro universitet

  1. Testing Missing at Random using Instrumental Variables By Christoph Breunig
  2. Testing for normality with applications By Marian Vavra
  3. Autoregressive moving average infinite hidden markov-switching models By Bauwens, Luc; Carpantier, Jean-François; Dufays, Arnaud
  4. JOINT SPECIFICATION TESTS FOR RESPONSE PROBABILITIES IN UNORDERED MULTINOMIAL CHOICE MODELS By Masamune Iwasawa
  5. Inference and Testing Breaks in Large Dynamic Panels with Strong Cross Sectional Dependence By Javier Hidalgo; Marcia M Schafgans
  6. "Comparison of Linear Shrinkage Estimators of a Large Covariance Matrix in Normal and Non-normal Distributions" By Yuki Ikeda; Tatsuya Kubokawa; Muni S. Srivastava
  7. Kernel filtering of spot volatility in presence of Lévy jumps and market microstructure noise By Yu, Chao; Fang, Yue; Zhao, Xujie; Zhang, Bo
  8. Refinements in maximum likelihood inference on spatial autocorrelation in panel data By Peter Robinson; Francesca Rossi
  9. Optimal Sup-norm Rates, Adaptivity and Inference in Nonparametric Instrumental Variables Estimation By Xiaohong Chen; Timothy Christensen
  10. Structural Vector Autoregressions with Heteroskedasticity: A Comparison of Different Volatility Models By Helmut Lütkepohl; Aleksei Netsunajev
  11. Instrumental variable estimation in functional linear models By Van Bellegem, Sébastien; Florens, Jean-Pierre
  12. Local Polynomial Derivative Estimation: Analytic or Taylor? By Jeffrey S. Racine
  13. Treatments of Non-metric Variables in Partial Least Squares and Principal Component Analysis By Jisu Yoon; Tatyana Krivobokova
  14. "A Robust Estimation of Integrated Volatility under Round-off Errors, Micro-market Price Adjustments and Noises" By Seisho Sato; Naoto Kunitomo
  15. A simple model for now-casting volatility series By Hafner, Christian M.; Breitung, Jörg
  16. Real-Time Forecasting with a MIDAS VAR By Heiner Mikosch; Stefan Neuwirth
  17. The “wrong skewness” problem: a re-specification of Stochastic Frontiers. By Bonanno, Graziella; De Giovanni, Domenico; Domma, Filippo
  18. Construction of value-at-risk forecasts under different distributional assumptions within a BEKK framework By Braione, Manuela; Scholtes, Nicolas K.
  19. A Quantal Response Model of Firm Competition By Ellis Scharfenaker
  20. Forecasting comparison of long term component dynamic models for realized covariance matrices By BAUWENS, Luc; BRAIONE, Manuela; STORTI, Giuseppe
  21. Pooling Multiple Case Studies Using Synthetic Controls: An Application to Minimum Wage Policies By Dube, Arindrajit; Zipperer, Ben
  22. A distinction between causal effects in structural and rubin causal models By Aliprantis, Dionissi
  23. A fixed effects ordered choice model with flexible thresholds with an application to life-satisfaction By Yaman, F.; Cubí‐Mollá, P.
  24. Misspecification and Expectations Correction in New Keynesian DSGE Models By Giovanni Angelini; Luca Fanelli

  1. By: Christoph Breunig
    Abstract: This paper proposes a test for missing at random (MAR). The MAR assumption is shown to be testable given instrumental variables which are independent of response given potential outcomes. A nonparametric testing procedure based on integrated squared distance is proposed. The statistic’s asymptotic distribution under the MAR hypothesis is derived. We demonstrate that our results can be easily extended to a test of missing completely at random (MCAR) and missing completely at random conditional on covariates X (MCAR(X)). A Monte Carlo study examines the finite sample performance of our test statistic. An empirical illustration concerns out-of-pocket prescription drug spending with missing values; we reject MCAR but fail to reject MAR.
    Keywords: Incomplete data, missing-data mechanism, selection model, nonparametric hypothesis testing, consistent testing, instrumental variable, series estimation.
    JEL: C12 C14
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2015-016&r=ecm
  2. By: Marian Vavra (National Bank of Slovakia, Research Department)
    Abstract: This paper considers the problem of testing for normality of the marginal law of univariate and multivariate stationary and weakly dependent random processes, using a bootstrap-based Anderson-Darling test statistic. The finite-sample properties of the test are assessed via Monte Carlo experiments. An application to inflation forecast errors is also presented.
    Keywords: testing for normality; Anderson-Darling statistic; sieve bootstrap; weak dependence
    JEL: C12 C15 C32
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:svk:wpaper:1031&r=ecm
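    Sketch: a minimal Python rendering of the bootstrap idea behind the test, assuming a Gaussian AR(p) sieve approximation; the lag order p, replication count and burn-in are illustrative choices, not the paper's.
      import numpy as np
      from scipy.stats import anderson

      def sieve_bootstrap_ad(x, p=4, n_boot=499, burn=100, seed=0):
          # Fit an AR(p) "sieve" by least squares to capture weak dependence.
          rng = np.random.default_rng(seed)
          x = np.asarray(x, dtype=float)
          n = len(x)
          Z = np.column_stack([np.ones(n - p)] +
                              [x[p - k - 1:n - k - 1] for k in range(p)])
          beta = np.linalg.lstsq(Z, x[p:], rcond=None)[0]
          sigma = (x[p:] - Z @ beta).std(ddof=p + 1)
          ad_obs = anderson(x).statistic
          exceed = 0
          for _ in range(n_boot):
              # Impose the null: normal innovations give a normal marginal
              # law for the fitted linear AR; recompute the AD statistic.
              e = rng.normal(0.0, sigma, size=n + burn)
              xb = np.zeros(n + burn)
              for t in range(p, n + burn):
                  xb[t] = beta[0] + xb[t - p:t][::-1] @ beta[1:] + e[t]
              exceed += anderson(xb[burn:]).statistic >= ad_obs
          return ad_obs, (1 + exceed) / (1 + n_boot)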
  3. By: Bauwens, Luc (Université catholique de Louvain, CORE, Belgium); Carpantier, Jean-François (CREA, University of Luxembourg); Dufays, Arnaud (Université catholique de Louvain, CORE, Belgium)
    Abstract: Markov-switching models are usually specified under the assumption that all the parameters change when a regime switch occurs. Relaxing this hypothesis and being able to detect which parameters evolve over time is relevant for interpreting the changes in the dynamics of the series, for specifying models parsimoniously, and may be helpful in forecasting. We propose the class of sticky infinite hidden Markov-switching autoregressive moving average models, in which we disentangle the break dynamics of the mean and the variance parameters. In this class, the number of regimes is possibly infinite and is determined when estimating the model, thus avoiding the need to set this number by a model choice criterion. We develop a new Markov chain Monte Carlo estimation method that solves the path dependence issue due to the moving average component. Empirical results on macroeconomic series illustrate that the proposed class of models dominates the model with fixed parameters in terms of point and density forecasts.
    Keywords: ARMA, Bayesian inference, Dirichlet process, Forecasting, Markov-switching
    JEL: C11 C15 C22 C53 C58
    Date: 2015–02–13
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2015007&r=ecm
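    Sketch: the "infinite" regime dimension rests on a Dirichlet-process prior; the following truncated stick-breaking construction (truncation level n_atoms is an illustrative device, not part of the model) shows how such regime weights arise.
      import numpy as np

      def stick_breaking_weights(alpha, n_atoms, seed=0):
          # w_k = v_k * prod_{j<k} (1 - v_j) with v_k ~ Beta(1, alpha):
          # a random, infinite (here truncated) set of regime weights.
          rng = np.random.default_rng(seed)
          v = rng.beta(1.0, alpha, size=n_atoms)
          return v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))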
  4. By: Masamune Iwasawa (Graduate School of Economics, Kyoto University, Research Fellow of Japan Society for the Promotion of Science)
    Abstract: Estimation results obtained by parametric models may be seriously misleading when the model is misspecified or poorly approximates the true model. This study proposes two tests that jointly test the specifications of multiple response probabilities in unordered multinomial choice models. Both test statistics are asymptotically chi-square distributed, consistent against a fixed alternative, and able to detect a local alternative approaching the null at a rate slower than the parametric rate. We show that, when the sample size is small, rejection regions can be calculated by a simple parametric bootstrap procedure. The size and power of the tests are investigated by Monte Carlo experiments.
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:919&r=ecm
  5. By: Javier Hidalgo; Marcia M Schafgans
    Abstract: This paper is concerned with various issues related to inference in large dynamic panel data models (where both n and T increase without bound) in the presence of possibly strong cross-sectional dependence. Our first aim is to provide a Central Limit Theorem for estimators of the slope parameters of the model under mild conditions; to that end, we extend and modify existing results in the literature. Our second aim is to study two similar tests for breaks/homogeneity in the time dimension: the first is based on the CUSUM principle, the second on a Hausman-Durbin-Wu approach. A key feature of the tests is that they have nontrivial power when the number of individuals for which the slope parameters may differ is a "negligible" fraction, or when the break happens towards the end of the sample. Because the asymptotic distribution of the tests may not provide a good approximation to their finite sample distribution, we describe a simple bootstrap algorithm to obtain asymptotically valid critical values for our statistics. An important and surprising feature of the bootstrap is that there is no need to know the underlying model of the cross-sectional dependence; hence the bootstrap requires no bandwidth parameter for its implementation, unlike moving block bootstrap methods, which may not be valid under cross-sectional dependence and may depend on the particular ordering of the individuals. Finally, we present a Monte Carlo simulation analysis to shed some light on the small sample behaviour of the tests and their bootstrap analogues.
    Keywords: Large panel data, dynamic models, cross-sectional strong-dependence, central limit theorems, homogeneity, bootstrap algorithms
    JEL: C12 C13 C23
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2015/583&r=ecm
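    Sketch: a stripped-down reminder of the CUSUM principle underlying the first test; the paper's panel version standardizes the statistic quite differently.
      import numpy as np

      def cusum_break_stat(resid):
          # Under parameter stability, standardized partial sums of
          # demeaned residuals behave like a Brownian bridge; large
          # excursions of the path signal a break.
          e = resid - resid.mean()
          path = np.cumsum(e) / (e.std(ddof=1) * np.sqrt(len(e)))
          return np.abs(path).max()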
  6. By: Yuki Ikeda (Graduate School of Economics, The University of Tokyo); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo); Muni S. Srivastava (Department of Statistics, University of Toronto)
    Abstract: The problem of estimating the large covariance matrix of both normal and non-normal distributions is addressed. For convex combinations of the sample covariance matrix and the identity matrix multiplied by a scalar statistic, we suggest a new estimator of the optimal weight, based on exact or approximately unbiased estimators of the numerator and denominator of the optimal weight in non-normal cases. It is also demonstrated that the estimators given in the literature have second-order biases. It is numerically shown that the proposed estimator has a good risk performance.
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2015cf970&r=ecm
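    Sketch: for orientation, the classical Ledoit-Wolf plug-in for the convex-combination weight; the paper's contribution is a refined, (approximately) unbiased estimator of this weight, which is not reproduced here.
      import numpy as np

      def linear_shrinkage_cov(X):
          # Sigma_hat = rho * m * I + (1 - rho) * S, with S the sample
          # covariance, m = tr(S)/p the target scale and rho in [0, 1].
          n, p = X.shape
          Xc = X - X.mean(axis=0)
          S = Xc.T @ Xc / n
          m = np.trace(S) / p
          d2 = np.sum((S - m * np.eye(p)) ** 2)
          b2 = sum(np.sum((np.outer(xi, xi) - S) ** 2) for xi in Xc) / n**2
          rho = min(b2, d2) / d2
          return rho * m * np.eye(p) + (1 - rho) * S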
  7. By: Yu, Chao; Fang, Yue; Zhao, Xujie; Zhang, Bo
    Abstract: This paper considers the problem of estimating spot volatility in the simultaneous presence of Lévy jumps and market microstructure noise. We propose to use the pre-averaging approach and the threshold kernel-based method to construct a spot volatility estimator, which is robust to both microstructure noise and jumps of either finite or infinite activity. The estimator is consistent and asymptotically normal, with a fast convergence rate. Our estimator is general enough to include many existing kernel-based estimators as special cases. When the kernel bandwidth is fixed, our estimator leads to widely used estimators of integrated volatility. Monte Carlo simulations show that our estimator works very well.
    Keywords: high-frequency data, spot volatility, Lévy jump, kernel estimation, microstructure noise, pre-averaging
    JEL: C13 C58
    Date: 2013–03–23
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:63293&r=ecm
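    Sketch: the pre-averaging step in isolation, with the common weight g(s) = min(s, 1 - s); the paper combines it with a threshold that discards pre-averaged increments too large to be Brownian, which removes the jump component. Window length kn is a tuning choice.
      import numpy as np

      def preaveraged_returns(dY, kn):
          # Smooth noisy tick returns dY over kn-tick windows; i.i.d.
          # microstructure noise is averaged out while the volatility
          # signal survives.
          s = np.arange(1, kn) / kn
          g = np.minimum(s, 1.0 - s)
          return np.array([g @ dY[i:i + kn - 1]
                           for i in range(len(dY) - kn + 2)])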
  8. By: Peter Robinson; Francesca Rossi
    Abstract: In a panel data model with fixed effects, possible cross-sectional dependence is investigated in a spatial autoregressive setting. An Edgeworth expansion is developed for the maximum likelihood estimate of the spatial correlation coefficient. The expansion is used to develop more accurate interval estimates for the coefficient, and tests for cross-sectional independence that have better size properties, than corresponding rules of statistical inference based on first order asymptotic theory. Comparisons of finite sample performance are carried out using Monte Carlo simulations.
    Keywords: Panel data; Fixed effects; Spatial autoregression; Edgeworth expansion; Interval estimates; Tests for cross-sectional independence
    JEL: C12 C21 C31
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:61432&r=ecm
  9. By: Xiaohong Chen (Cowles Foundation, Yale University); Timothy Christensen (Dept. of Economics, Yale University)
    Abstract: This paper makes several contributions to the literature on the important yet difficult problem of estimating functions nonparametrically using instrumental variables. First, we derive the minimax optimal sup-norm convergence rates for nonparametric instrumental variables (NPIV) estimation of the structural function h_0 and its derivatives. Second, we show that a computationally simple sieve NPIV estimator can attain the optimal sup-norm rates for h_0 and its derivatives when h_0 is approximated via a spline or wavelet sieve. Our optimal sup-norm rates surprisingly coincide with the optimal L^2-norm rates for severely ill-posed problems, and are only up to a [log(n)]^epsilon (with epsilon < 1/2) factor slower than the optimal L^2-norm rates for mildly ill-posed problems. Third, we introduce a novel data-driven procedure for choosing the sieve dimension optimally. Our data-driven procedure is sup-norm rate-adaptive: the resulting estimator of h_0 and its derivatives converge at their optimal sup-norm rates even though the smoothness of h_0 and the degree of ill-posedness of the NPIV model are unknown. Finally, we present two non-trivial applications of the sup-norm rates to inference on nonlinear functionals of h_0 under low-level conditions. The first is to derive the asymptotic normality of sieve t-statistics for exact consumer surplus and deadweight loss functionals in nonparametric demand estimation when prices, and possibly incomes, are endogenous. The second is to establish the validity of a sieve score bootstrap for constructing asymptotically exact uniform confidence bands for collections of nonlinear functionals of h_0. Both applications provide new and useful tools for empirical research on nonparametric models with endogeneity.
    Keywords: Ill-posed inverse problems, Series 2SLS, Optimal sup-norm convergence rates, Adaptive estimation, Random matrices, Bootstrap uniform confidence bands, Nonlinear welfare functionals, Nonparametric demand analysis with endogeneity
    JEL: C13 C14 C32
    Date: 2013–11
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1923r&r=ecm
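    Sketch: the computationally simple estimator is a series two-stage least squares; this toy version uses plain polynomial bases, whereas the paper uses spline or wavelet sieves and chooses the sieve dimension J by its data-driven, rate-adaptive procedure.
      import numpy as np

      def sieve_npiv(y, x, w, J=6, K=9):
          # Bases for the unknown function h (dimension J) and for the
          # instrument space (dimension K >= J).
          psi = lambda s: np.vander(s, J, increasing=True)
          b = lambda s: np.vander(s, K, increasing=True)
          P, B = psi(x), b(w)
          A = P.T @ B @ np.linalg.pinv(B.T @ B) @ B.T  # project on instruments
          c = np.linalg.pinv(A @ P) @ (A @ y)          # 2SLS coefficients
          return lambda xs: psi(xs) @ c                # h_hat(.)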
  10. By: Helmut Lütkepohl; Aleksei Netsunajev
    Abstract: A growing literature uses changes in residual volatility for identifying structural shocks in vector autoregressive (VAR) analysis. A number of different models for heteroskedasticity or conditional heteroskedasticity are proposed and used in applications in this context. This study reviews the different volatility models and points out their advantages and drawbacks. It thereby enables researchers wishing to use identification of structural VAR models via heteroskedasticity to make a more informed choice of a suitable model for a specific empirical analysis. An application investigating the interaction between U.S. monetary policy and the stock market is used to illustrate the related issues.
    Keywords: Structural vector autoregression, identification via heteroskedasticity, conditional heteroskedasticity, smooth transition, Markov switching, GARCH
    JEL: C32
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1464&r=ecm
  11. By: Van Bellegem, Sébastien (Université catholique de Louvain, CORE, Belgium); Florens, Jean-Pierre (Toulouse School of Economics)
    Abstract: In an increasing number of empirical studies, the dimensionality, measured e.g. as the size of the parameter space of interest, can be very large. Two instances of large dimensional models are the linear regression with a large number of covariates and the estimation of a regression function with many instrumental variables. An appropriate setting to analyze high dimensional problems is provided by a functional linear model, in which the covariates belong to Hilbert spaces. This paper considers the case where covariates are endogenous and assumes the existence of instrumental variables (that are functional as well). The paper shows that estimating the regression function is a linear ill-posed inverse problem, with a known but data-dependent operator. The first contribution is to analyze the rate of convergence of the penalized least squares estimator. Based on this result, we discuss the notion of “instrument strength” in the high dimensional setting. We also discuss a generalized version of the estimator, when the problem is premultiplied by an instrument-dependent operator. This extends the technology of the Generalized Method of Moments to high dimensional, functional data. A central limit theorem is also established for the inner product of the estimator. The studied estimators are easy and fast to implement, and their finite-sample performance is discussed through simulations and an application to the impact of age-specific fertility rate curves on yearly economic growth in the United Kingdom.
    Keywords: functional linear model; instrumental variables; ill-posed inverse problem; penalized least squares
    JEL: C26 C14
    Date: 2014–09–30
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2014056&r=ecm
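    Sketch: the penalized least squares estimator for an ill-posed linear inverse problem in its simplest discretized (Tikhonov/ridge) form; the paper works with genuine operators on Hilbert spaces and a data-dependent operator.
      import numpy as np

      def penalized_ls(K, r, alpha):
          # Solve K beta ~ r where K'K is near-singular: the penalty
          # alpha stabilizes the normal equations.
          p = K.shape[1]
          return np.linalg.solve(K.T @ K + alpha * np.eye(p), K.T @ r)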
  12. By: Jeffrey S. Racine
    Abstract: Local polynomial regression is extremely popular in applied settings. Recent developments in shape constrained nonparametric regression allow practitioners to impose constraints on local polynomial estimators, thereby ensuring that the resulting estimates are consistent with underlying theory. However, it turns out that local polynomial derivative estimates may fail to coincide with the analytic derivative of the local polynomial regression estimate, which can be problematic, particularly in the context of shape constrained estimation. In such cases practitioners might prefer to instead use analytic derivatives along the lines of those proposed in the local constant setting by Rilstone & Ullah (1989). Demonstrations and applications are considered.
    Keywords: nonparametric, smoothing, constrained estimation
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:mcm:deptwp:2015-02&r=ecm
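    Sketch: the distinction in miniature. The "Taylor" derivative is the slope coefficient of a local fit at x0; the analytic derivative differentiates the fitted curve x0 -> m_hat(x0), whose kernel weights move with x0, so the two need not coincide. Data, kernel and bandwidth below are illustrative.
      import numpy as np

      def local_poly(x0, x, y, h, degree=2):
          # Kernel-weighted least squares of y on powers of (x - x0).
          Z = np.vander(x - x0, degree + 1, increasing=True)
          w = np.sqrt(np.exp(-0.5 * ((x - x0) / h) ** 2))
          return np.linalg.lstsq(Z * w[:, None], y * w, rcond=None)[0]

      rng = np.random.default_rng(1)
      x = np.sort(rng.uniform(0, 1, 200))
      y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=200)
      h, eps, x0 = 0.1, 1e-5, 0.5
      taylor = local_poly(x0, x, y, h)[1]          # slope coefficient
      analytic = (local_poly(x0 + eps, x, y, h)[0] -
                  local_poly(x0 - eps, x, y, h)[0]) / (2 * eps)
      print(taylor, analytic)                      # close, not identical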
  13. By: Jisu Yoon (Georg-August-University Göttingen); Tatyana Krivobokova (Georg-August-University Göttingen)
    Abstract: This paper reviews various treatments of non-metric variables in Partial Least Squares (PLS) and Principal Component Analysis (PCA) algorithms. The performance of the different treatments is compared in an extensive simulation study under several typical data generating processes, and recommendations are made. An application of PLS and PCA algorithms with non-metric variables to the generation of a wealth index is considered.
    Keywords: Principal Component Analysis; PCA; Partial Least Squares; PLS; non-metric variables; simulation; wealth index
    JEL: C15 C43 R20
    Date: 2015–03–27
    URL: http://d.repec.org/n?u=RePEc:got:gotcrc:172&r=ecm
  14. By: Seisho Sato (Faculty of Economics, The University of Tokyo); Naoto Kunitomo (Faculty of Economics, The University of Tokyo)
    Abstract: For estimating the integrated volatility using high frequency data, Kunitomo and Sato (2008, 2011, 2013) have proposed the Separating Information Maximum Likelihood (SIML) method when there are micro-market noises. The SIML estimator has reasonable finite sample properties and asymptotic properties when the sample size is large under reasonable conditions. We show that the SIML estimator has robustness properties in the sense that it is consistent and has stable convergence (i.e. asymptotic normality in the deterministic case) when there are round-off errors, micro-market price adjustments and noises for the underlying (continuous time) stochastic process. The SIML estimation also has reasonable finite sample properties with these effects and dominates existing methods such as the realized kernel method and the pre-averaging method in some situations.
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2015cf964&r=ecm
  15. By: Hafner, Christian M. (Université catholique de Louvain, CORE, Belgium); Breitung, Jörg (University of Cologne)
    Abstract: Nowcasting volatility of financial time series appears difficult with classical volatility models. This paper proposes a simple model, based on an ARMA representation of the log-transformed squared returns, that allows one to estimate current volatility, given past and current returns, in a very simple way. The model can be viewed as a degenerate case of the stochastic volatility model with perfect correlation between the two error terms. It is shown that the volatility nowcasts do not depend on this correlation, so that both models provide the same nowcasts for given parameter values. A simulation study suggests that the ARMA and SV models have a similar performance, but that in cases of moderate persistence the ARMA model is preferable. An extension of the ARMA model is proposed that takes into account the so-called leverage effect. Finally, the alternative models are applied to a long series of daily S&P 500 returns.
    Keywords: EGARCH, stochastic volatility, ARMA, realized volatility
    JEL: C22 C58
    Date: 2014–11–19
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2014060&r=ecm
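    Sketch: the measurement idea in its rawest form, assuming statsmodels is available; the small offset guards against log(0) and the ARMA order is illustrative. The paper's nowcast additionally conditions on the current return.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      def volatility_nowcast(returns):
          # Fit an ARMA(1,1) to log squared demeaned returns and map the
          # one-step predictions back to a rough volatility proxy.
          y = np.log((returns - returns.mean()) ** 2 + 1e-8)
          res = ARIMA(y, order=(1, 0, 1)).fit()
          return np.exp(res.fittedvalues / 2.0)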
  16. By: Heiner Mikosch (KOF Swiss Economic Institute, ETH Zurich, Switzerland); Stefan Neuwirth (KOF Swiss Economic Institute, ETH Zurich, Switzerland)
    Abstract: This paper presents a MIDAS type mixed frequency VAR forecasting model. First, we propose a general and compact mixed frequency VAR framework using a stacked vector approach. Second, we integrate the mixed frequency VAR with a MIDAS type Almon lag polynomial scheme, which is designed to reduce the parameter space while keeping models flexible. We show how to recast the resulting non-linear MIDAS type mixed frequency VAR into a linear equation system that can be easily estimated. A pseudo out-of-sample forecasting exercise with US real-time data shows that the mixed frequency VAR substantially improves predictive accuracy upon a standard VAR across different VAR specifications. Forecast errors for, e.g., GDP growth decrease by 30 to 60 percent for forecast horizons of up to six months and by around 20 percent for a forecast horizon of one year.
    Keywords: Forecasting, mixed frequency data, MIDAS, VAR, real time
    JEL: C53 E27
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:kof:wpskof:15-377&r=ecm
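    Sketch: the parameter-space reduction comes from tying many high-frequency lag coefficients to a low-dimensional polynomial; below is the common exponential Almon variant, in which two parameters govern an arbitrarily long lag profile.
      import numpy as np

      def almon_weights(theta1, theta2, n_lags):
          # w_j proportional to exp(theta1*j + theta2*j^2), normalized
          # to sum to one.
          j = np.arange(1, n_lags + 1)
          w = np.exp(theta1 * j + theta2 * j ** 2)
          return w / w.sum()

      w = almon_weights(0.1, -0.05, 12)  # e.g. twelve monthly lags in a quarterly model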
  17. By: Bonanno, Graziella; De Giovanni, Domenico; Domma, Filippo
    Abstract: In this paper, we study the so-called “wrong skewness” anomaly in Stochastic Frontiers (SF), which consists in the observed difference between the expected and estimated sign of the asymmetry of the composite error. We propose a more general and flexible specification of the SF model, introducing dependence between the two error components and asymmetry (positive or negative) of the random error. This re-specification allows us to decompose the third moment of the composite error in three components, namely: i) the asymmetry of the inefficiency term; ii) the asymmetry of the random error; and iii) the structure of dependence between the error components. This decomposition suggests that the “wrong skewness” anomaly is an ill-posed problem, because we cannot establish ex ante the expected sign of the asymmetry of the composite error. We report a relevant special case that allows us to estimate the three components of the asymmetry of the composite error and, consequently, to interpret the estimated sign. We present two empirical applications. In the first dataset, where the classic SF displays wrong skewness, estimation of our model rejects the dependence hypothesis, but accepts the asymmetry of the random error, thus justifying the sign of the skewness of the composite error. In the second dataset, where the classic SF does not display any anomaly, estimation of our model provides evidence of the presence of both dependence between the error components and asymmetry of the random error.
    Keywords: Stochastic frontier models, Skewness, Generalised Logistic distribution, Dependence, Copula functions
    JEL: C13 C18 C46 D24
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:63429&r=ecm
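    Sketch: the anomaly itself is easy to see in residuals; a classic production frontier with composite error v - u implies negatively skewed OLS residuals, so a positive sample skewness is "wrong". The paper explains why this expectation need not hold once dependence and an asymmetric random error are allowed.
      from scipy.stats import skew

      def wrong_skewness(ols_residuals, frontier="production"):
          # Sign conventions flip between production and cost frontiers.
          s = skew(ols_residuals)
          wrong = s > 0 if frontier == "production" else s < 0
          return s, wrong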
  18. By: Braione, Manuela (Université catholique de Louvain, CORE, Belgium); Scholtes, Nicolas K. (Université catholique de Louvain, CORE, Belgium)
    Abstract: Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed, fat-tailed and often skewed. In order to account for both the skewness and the excess kurtosis in returns, we combine the BEKK model from the multivariate GARCH literature with different multivariate densities for the returns. The set of distributions we consider comprises the normal, Student, Multivariate Exponential Power and their skewed counterparts. Applying this framework to a sample of ten assets from the Dow Jones Industrial Average Index, we compare the performance of equally-weighted portfolios derived from the symmetric and skewed distributions in forecasting out-of-sample Value-at-Risk. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures. The results unanimously show that the inclusion of fat-tailed densities into the model specification yields more accurate VaR forecasts, while the further addition of skewness does not lead to significant improvements.
    Keywords: Dow Jones industrial average, BEKK model, maximum likelihood, value-at-risk
    JEL: C01 C22 C52 C58
    Date: 2014–11–18
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2014059&r=ecm
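    Sketch: once the BEKK step delivers a portfolio volatility forecast sigma, the distributional assumption only enters through a quantile; the Student-t quantile is rescaled to unit variance before scaling. Parameters below are illustrative.
      from scipy.stats import norm, t

      def one_day_var(sigma, alpha=0.01, dist="t", nu=5):
          # VaR for a zero-mean return with forecast volatility sigma.
          if dist == "normal":
              q = norm.ppf(alpha)
          else:
              q = t.ppf(alpha, df=nu) * ((nu - 2.0) / nu) ** 0.5
          return -q * sigma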
  19. By: Ellis Scharfenaker (Department of Economics, New School for Social Research)
    Abstract: The distribution of profit rates in the U.S. economy for 21,714 firms from 1962-2012 appears to be highly organized in a Laplace-like distribution. Positive profit rate deviations from the mode appear to be remarkably stationary over time, displaying little parametric change, while negative profit rate deviations introduce an asymmetry into the distribution that appears to fluctuate over time. In this paper I propose a model of "classically" competitive firms facing informational entropy constraints in their decisions to potentially enter or exit markets based on profit rate differentials. The result is a three-parameter logit quantal response distribution for firm entry and exit decisions. Bayesian methods are used for inference into the distribution of entry and exit decisions conditional on profit rate deviations, and firm-level data from Compustat are used to test these predictions. The model parameters show a fluctuating asymmetry in firm exit decisions, an increase in dispersion of negative profit rate differentials, and a falling general rate of profit.
    Keywords: Firm competition, Laplace distribution, Gibbs sampler, profit rate, statistical equilibrium, rational inattention, information theory, quantal response
    JEL: C10 C15 D20 D22 E10 L11
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:new:wpaper:1507&r=ecm
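    Sketch: the core of a logit quantal response entry rule; beta indexes how sharply decisions respond to profit rate differentials (the informational entropy constraint), and the paper's full three-parameter model adds asymmetry between entry and exit.
      import numpy as np

      def entry_probability(profit_gap, beta):
          # Probability of entering rises smoothly with the firm's
          # profit rate differential instead of jumping at zero.
          return 1.0 / (1.0 + np.exp(-beta * profit_gap))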
  20. By: BAUWENS, Luc (Université catholique de Louvain, CORE, Belgium); BRAIONE, Manuela (Université catholique de Louvain, CORE, Belgium); STORTI, Giuseppe (Università di Salerno)
    Abstract: Novel model specifications that include a time-varying long run component in the dynamics of realized covariance matrices are proposed. The adopted modeling framework allows the secular component to enter the model structure either in an additive fashion or as a multiplicative factor, and to be specified parametrically, using a MIDAS filter, or non-parametrically. Estimation is performed by maximizing a Wishart quasi-likelihood function. The one-step ahead forecasting performance of the models is assessed by means of three approaches: the Model Confidence Set, (global) minimum variance portfolios and Value-at-Risk. The results provide evidence in favour of the hypothesis that the proposed models outperform benchmarks incorporating a constant long run component, both in- and out-of-sample.
    Keywords: Realized covariance, component dynamic models, MIDAS, minimum variance portfolio, Model Confidence Set, Value-at-Risk
    JEL: C13 C32 C58
    Date: 2014–11–30
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2014053&r=ecm
  21. By: Dube, Arindrajit (University of Massachusetts Amherst); Zipperer, Ben (University of Massachusetts Amherst)
    Abstract: We propose a simple, distribution-free method for pooling synthetic control case studies using the mean percentile rank. We also test for heterogeneous treatment effects using the distribution of estimated ranks, which has a known form. We propose a cross-validation based procedure for model selection. Using 29 cases of state minimum wage increases between 1979 and 2013, we find a sizable, positive and statistically significant effect on the average teen wage. We do detect heterogeneity in the wage elasticities, consistent with differential bites in the policy. In contrast, the employment estimates suggest a small constant effect not distinguishable from zero.
    Keywords: synthetic controls, program evaluation, heterogeneous treatment effects, minimum wage
    JEL: J38 J23 J88
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp8944&r=ecm
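    Sketch: the pooling logic, assuming each case supplies the treated estimate followed by its placebo (donor) estimates; under no effect each percentile rank is roughly uniform, so the mean rank over K independent cases is approximately normal with variance 1/(12K).
      import numpy as np
      from scipy.stats import norm

      def pooled_rank_test(cases):
          # cases: list of arrays, treated effect first, placebos after.
          ranks = [np.mean(np.asarray(c[1:]) < c[0]) for c in cases]
          K = len(ranks)
          z = (np.mean(ranks) - 0.5) / np.sqrt(1.0 / (12.0 * K))
          return np.mean(ranks), 2.0 * norm.sf(abs(z))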
  22. By: Aliprantis, Dionissi (Federal Reserve Bank of Cleveland)
    Abstract: Structural Causal Models define causal effects in terms of a single Data Generating Process (DGP), and the Rubin Causal Model defines causal effects in terms of a model that can represent counterfactuals from many DGPs. Under these different definitions, notationally similar causal effects make distinct claims about the results of interventions to the system under investigation: Structural equations imply conditional independencies in the data that potential outcomes do not. One implication is that the DAG of a Rubin Causal Model is different from the DAG of a Structural Causal Model. Another is that Pearl’s do-calculus does not apply to potential outcomes and the Rubin Causal Model.
    Keywords: Structural Equation; Potential Outcome; Invariance; Autonomy
    JEL: C00 C01 C31
    Date: 2015–03–27
    URL: http://d.repec.org/n?u=RePEc:fip:fedcwp:1505&r=ecm
  23. By: Yaman, F.; Cubí‐Mollá, P.
    Abstract: In many contexts, reported outcomes on a rating scale are modeled through the existence of a latent variable that separates the categories through thresholds. The literature has not been able to separate the effect of a variable on the latent variable from its effect on the threshold parameters. We propose a model which incorporates (1) individual fixed effects on the latent variable, (2) individual fixed effects on the thresholds, and (3) threshold shifts across time depending on observables. Importantly, the latent variable and the threshold specifications can include common variables. In order to illustrate the estimator, we apply it to a model of life satisfaction using the GSOEP dataset. We demonstrate that important differences can arise depending on the choice of the model. Our model suggests that threshold shifts are statistically and quantitatively important. Factors which increase reported life satisfaction do so both through positive effects on the latent variable AND through shifting thresholds to the left, while factors which decrease reported life satisfaction do so both through negative effects on the latent variable AND through shifting thresholds to the right.
    Keywords: Ordered choice; fixed effects; subjective well-being; life-satisfaction
    URL: http://d.repec.org/n?u=RePEc:cty:dpaper:8123&r=ecm
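    Sketch: the mechanism the model exploits, in miniature; letting covariates shift the thresholds tau as well as the latent index xb changes reported categories even when the latent variable is unchanged. Logistic F is assumed for illustration.
      import numpy as np

      def category_probs(xb, tau):
          # P(y = k) = F(tau_k - xb) - F(tau_{k-1} - xb), tau sorted.
          F = lambda z: 1.0 / (1.0 + np.exp(-z))
          cuts = np.concatenate(([-np.inf], np.asarray(tau, float), [np.inf]))
          return np.array([F(cuts[k + 1] - xb) - F(cuts[k] - xb)
                           for k in range(len(cuts) - 1)])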
  24. By: Giovanni Angelini; Luca Fanelli
    Abstract: This paper focuses on the dynamic misspecification that characterizes the class of small-scale New Keynesian models and provides a 'natural' remedy for the typical difficulties these models have in accounting for the rich contemporaneous and dynamic correlation structure of the data, which are generally faced with ad hoc shock specifications. We suggest using the 'best fitting' statistical model for the data as a device through which it is possible to adapt the econometric specification of the New Keynesian model. The statistical model may feature an autocorrelation structure that is more involved than the one implied by the structural model's reduced form solution under rational expectations, and it is treated as the actual expectations generating mechanism of the agents. A pseudo-structural form is built from the baseline system of Euler equations by forcing the state vector of the system to have the same dimension as the state vector characterizing the statistical model. We provide an empirical illustration based on U.S. quarterly data and a small-scale monetary New Keynesian model.
    Keywords: Dynamic stochastic general equilibrium model, Expectations, Kalman filter, New Keynesian models, State space model.
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:bot:quadip:wpaper:125&r=ecm

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.