
on Econometrics 
By:  Koo, Chao (Tilburg University, School of Economics and Management) 
Abstract:  This dissertation is composed of three essays on functional coefficient models (also referred to as varying-coefficient models) in the time series context. The first essay proposes two estimators for a functional coefficient model with discontinuities in the coefficient functions. One is based on the weighted residual mean squared error, which works well only if the conditional error variance is continuous. The other is based on local Wald test statistics and is applicable even if the conditional error variance contains discontinuities. In the second essay, we introduce a new model, the semiparametric transition model, and propose an iterative estimation procedure based on the straightforward application of (local) least squares. Simulations demonstrate that the proposed estimation provides precise estimates for many types of transition functions. The third essay proposes an estimator for a functional coefficient model with endogenous variables. In contrast to the existing functional coefficient IV literature, our estimator is adapted to the case in which the coefficients are functions of an endogenous variable. To illustrate the utility of our approach, we provide an empirical example based on the relationship among the hourly wage rate, education level, and work experience. 
Date:  2018 
URL:  http://d.repec.org/n?u=RePEc:tiu:tiutis:ba87b8a53c5540ec967d9eab42c14ddf&r=ecm 
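As background for the functional coefficient setup, the following is a minimal sketch of a local constant (kernel) estimator for a model y_i = a(z_i)·x_i + e_i with a smooth coefficient function. This is a generic baseline, not the dissertation's jump-detection estimators; the kernel, bandwidth, and data-generating process below are all illustrative assumptions.

```python
import math
import random

def gauss_kernel(u):
    return math.exp(-0.5 * u * u)

def local_constant_coef(z0, z, x, y, h):
    """Local constant estimate of a(z0) in y_i = a(z_i) * x_i + e_i."""
    num = den = 0.0
    for zi, xi, yi in zip(z, x, y):
        w = gauss_kernel((zi - z0) / h)
        num += w * xi * yi
        den += w * xi * xi
    return num / den

random.seed(0)
n = 5000
z = [random.random() for _ in range(n)]        # variable driving the coefficient
x = [random.gauss(0, 1) for _ in range(n)]     # regressor
a = lambda v: 1.0 + v                          # true smooth coefficient function
y = [a(zi) * xi + 0.1 * random.gauss(0, 1) for zi, xi in zip(z, x)]

est = local_constant_coef(0.5, z, x, y, h=0.1)
print(round(est, 2))  # close to a(0.5) = 1.5
```

With a discontinuous a(·), such a smoother is biased near the jump points, which is what motivates the specialized estimators in the first essay.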
By:  Matthew A. Masten; Alexandre Poirier 
Abstract:  How should one assess the credibility of assumptions weaker than statistical independence, like quantile independence? In the context of identifying causal effects of a treatment variable, we argue that such deviations should be chosen based on the form of selection on unobservables they allow. For quantile independence, we characterize this form of treatment selection. Specifically, we show that quantile independence is equivalent to a constraint on the average value of either a latent propensity score (for a binary treatment) or the cdf of treatment given the unobservables (for a continuous treatment). In both cases, this average value constraint requires a kind of nonmonotonic treatment selection. Using these results, we show that several common treatment selection models are incompatible with quantile independence. We introduce a class of assumptions which weakens quantile independence by removing the average value constraint, and therefore allows for monotonic treatment selection. In a potential outcomes model with a binary treatment, we derive identified sets for the ATT and QTT under both classes of assumptions. In a numerical example we show that the average value constraint inherent in quantile independence has substantial identifying power. Our results suggest that researchers should carefully consider the credibility of this nonmonotonicity property when using quantile independence to weaken full independence. 
Date:  2018–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1804.10957&r=ecm 
By:  Jonas Dovern (Alfred-Weber-Institute for Economics, Heidelberg University); Hans Manner (University of Graz, Austria) 
Abstract:  Established tests for proper calibration of multivariate density forecasts based on Rosenblatt probability integral transforms can be manipulated by changing the order of variables in the forecasting model. We derive order-invariant tests. The new tests are applicable to densities of arbitrary dimension and can deal with parameter estimation uncertainty and dynamic misspecification. Monte Carlo simulations show that they often have superior power relative to established approaches. We use the tests to evaluate GARCH-based multivariate density forecasts for a vector of stock market returns. 
Keywords:  Density calibration; Goodness-of-fit test; Predictive density; Rosenblatt transformation 
JEL:  C12 C32 C52 C53 
Date:  2018–04 
URL:  http://d.repec.org/n?u=RePEc:grz:wpaper:201809&r=ecm 
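The order-dependence problem that the abstract addresses can be seen in a bivariate Gaussian, where the Rosenblatt transform has a closed form: transform the first variable by its marginal CDF and the second by its conditional CDF given the first. The sketch below shows that transforming (x1, x2) versus (x2, x1) yields different PITs for the same observation; it illustrates the problem, not the authors' order-invariant tests, and the numbers are made up.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def rosenblatt_biv_gauss(x1, x2, rho):
    """Rosenblatt PITs for a bivariate standard normal with correlation rho,
    transforming x1 first, then x2 conditional on x1."""
    u1 = norm_cdf(x1)
    u2 = norm_cdf((x2 - rho * x1) / math.sqrt(1.0 - rho * rho))
    return u1, u2

x1, x2, rho = 0.3, -0.8, 0.7
pit_a = rosenblatt_biv_gauss(x1, x2, rho)  # variable order (1, 2)
pit_b = rosenblatt_biv_gauss(x2, x1, rho)  # variable order (2, 1): different PITs
print(pit_a, pit_b)
```

Both pairs are valid uniform PITs under the true model, yet they differ for the same observation, which is how reordering variables can flip the outcome of a PIT-based calibration test.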
By:  Christian Hansen; Damian Kozbur; Sanjog Misra 
Abstract:  This paper proposes a post-model-selection inference procedure, called targeted undersmoothing, designed to construct uniformly valid confidence sets for functionals of sparse high-dimensional models, including dense functionals that may depend on many or all elements of the high-dimensional parameter vector. The confidence sets are based on an initially selected model and two additional models which enlarge the initial model. By varying the enlargements of the initial model, one can also conduct sensitivity analysis of the strength of empirical conclusions to model selection mistakes in the initial model. We apply the procedure in two empirical examples: estimating heterogeneous treatment effects in a job training program and estimating profitability from an estimated mailing strategy in a marketing campaign. We also illustrate the procedure’s performance through simulation experiments. 
Keywords:  model selection, sparsity, dense functionals, hypothesis testing, sensitivity analysis 
JEL:  C12 C51 
Date:  2016–08 
URL:  http://d.repec.org/n?u=RePEc:zur:econwp:282&r=ecm 
By:  Franses, Ph.H.B.F. 
Abstract:  This paper introduces the idea of adjusting forecasts from a linear time series model, where the adjustment relies on the assumption that the linear model approximates, for example, a nonlinear time series model. This way of creating forecasts can be convenient when inference for the nonlinear model is impossible, complicated, or unreliable in small samples. The size of the forecast adjustment can be based on the estimation results for the linear model and on other data properties such as the first few moments or autocorrelations. An illustration is given for an ARMA(1,1) model, which is known to approximate a first-order diagonal bilinear time series model. For this case, the forecast adjustment is easy to derive, which is convenient because the particular bilinear model is cumbersome to analyze. An application to a range of inflation series for low-income countries shows that such an adjustment can lead to improved forecasts, although the gain is neither large nor frequent. 
Keywords:  ARMA(1,1), Inflation, First-order diagonal bilinear time series model, Method of Moments, Adjustment of forecasts 
JEL:  C22 C53 
Date:  2018–03–01 
URL:  http://d.repec.org/n?u=RePEc:ems:eureir:105879&r=ecm 
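One common form of the first-order diagonal bilinear model is y_t = b·y_{t-1}·e_{t-1} + e_t with i.i.d. Gaussian errors. The hypothetical simulation below (not the paper's adjustment formula, and the parameter values are arbitrary) illustrates why a linear ARMA model is a natural approximation: the simulated series shows short-memory autocorrelation that dies out quickly, which is exactly the kind of pattern a low-order linear model can match.

```python
import random

random.seed(1)
b, n, burn = 0.4, 20000, 500
y_prev, e_prev, ys = 0.0, 0.0, []
for t in range(n + burn):
    e = random.gauss(0.0, 1.0)
    y = b * y_prev * e_prev + e   # first-order diagonal bilinear recursion
    if t >= burn:
        ys.append(y)
    y_prev, e_prev = y, e

m = sum(ys) / len(ys)

def acf(k):
    """Sample autocorrelation at lag k."""
    num = sum((ys[t] - m) * (ys[t - k] - m) for t in range(k, len(ys)))
    den = sum((v - m) ** 2 for v in ys)
    return num / den

print(round(acf(1), 3), round(acf(2), 3))  # lag-1 positive, lag-2 near zero
```

The nonlinearity shows up in higher moments rather than in the autocorrelations, which is why a linear fit alone loses something and a moment-based forecast adjustment, as proposed in the paper, can help.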
By:  Søren Johansen (University of Copenhagen and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES) 
Abstract:  We consider the fractional cointegrated vector autoregressive (CVAR) model of Johansen and Nielsen (2012a) and make two distinct contributions. First, in their consistency proof, Johansen and Nielsen (2012a) imposed moment conditions on the errors that depend on the parameter space, such that when the parameter space is larger, stronger moment conditions are required. We show that these moment conditions can be relaxed, and for consistency we require just eight moments regardless of the parameter space. Second, Johansen and Nielsen (2012a) assumed that the cointegrating vectors are stationary, and we extend the analysis to include the possibility that the cointegrating vectors are nonstationary. Both contributions require new analysis and results for the asymptotic properties of the likelihood function of the fractional CVAR model, which we provide. Finally, our analysis follows recent research and applies a parameter space large enough that the usual (nonfractional) CVAR model constitutes an interior point and hence can be tested against the fractional model using a χ²-test. 
Keywords:  cointegration, fractional integration, likelihood inference, vector autoregressive model 
JEL:  C32 
Date:  2018–05 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1405&r=ecm 
By:  Chaohua Dong; Jiti Gao; Bin Peng 
Abstract:  This paper discusses a semiparametric single-index model. The link function is allowed to be unbounded and to have unbounded support, which fills a gap in the literature. The link function is treated as a point in an infinite-dimensional function space, which enables us to derive estimates for the index parameter and the link function simultaneously. This approach differs from the profile method commonly used in the literature. The estimator is derived from an optimization subject to an identification condition for the index parameter, which solves an important problem in the single-index literature. In addition, making use of a property of Hermite orthogonal polynomials, an explicit estimator for the index parameter is obtained. Asymptotic properties of the two estimators of the index parameter are established, and their efficiency is discussed in some special cases. The finite sample properties of the two estimators are demonstrated through an extensive Monte Carlo study and an empirical example. 
Keywords:  Asymptotic theory; closed-form estimation; cross-sectional model; Hermite series expansion 
JEL:  C13 C14 C51 
Date:  2018 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20185&r=ecm 
By:  Laszlo Balazsi; Felix Chan; Laszlo Matyas 
Abstract:  This paper proposes a new estimation procedure called the Event Count Estimator (ECE). The estimator is straightforward to implement and robust against outliers, censoring, and excess zeros in the data. The paper establishes the asymptotic properties of the new estimator, and the theoretical results are supported by several Monte Carlo experiments, which also show that the estimator has reasonable properties in moderate to large samples. As such, the efficiency cost of robustness is negligible from an applied viewpoint. The practical usefulness of the new estimator is demonstrated via an empirical application to the gravity model of trade. 
Date:  2018–04–11 
URL:  http://d.repec.org/n?u=RePEc:ceu:econwp:2018_2&r=ecm 
By:  Huber, Florian 
Abstract:  In this paper, we provide a parsimonious means of estimating panel VARs with stochastic volatility. We assume that the coefficients associated with domestic lagged endogenous variables arise from a finite mixture of Gaussian distributions. Shrinkage on the cluster size is introduced through suitable priors on the component weights, and cluster-relevant quantities are identified through novel normal-gamma shrinkage priors. To assess whether dynamic interdependencies between units are needed, we moreover impose shrinkage priors on the coefficients related to other countries' endogenous variables. Finally, our model controls for static interdependencies by assuming that the reduced-form shocks of the model feature a factor stochastic volatility structure. We assess the merits of the proposed approach using synthetic data as well as a real data application. In the empirical application, we forecast Eurozone unemployment rates and show that our proposed approach works well in terms of predictions. 
Keywords:  multi-country models, density predictions, hierarchical modeling, factor stochastic volatility models 
Date:  2018–04 
URL:  http://d.repec.org/n?u=RePEc:wiw:wus005:6247&r=ecm 
By:  Cristina Amado (University of Minho and NIPE, CREATES and Aarhus University); Annastiina Silvennoinen (School of Economics and Finance, Queensland University of Technology); Timo Teräsvirta (CREATES and Aarhus University, C.A.S.E., Humboldt-Universität zu Berlin) 
Abstract:  Univariate and multivariate GARCH-type models with a multiplicative decomposition of the variance into short- and long-run components are surveyed. The latter component can be either deterministic or stochastic, and examples of both types are studied. 
Keywords:  Conditional heteroskedasticity; Deterministically varying correlations; Multiplicative decomposition; Nonstationary volatility 
JEL:  C12 C32 C51 C52 
Date:  2018 
URL:  http://d.repec.org/n?u=RePEc:nip:nipewp:07/2018&r=ecm 
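The multiplicative decomposition surveyed above can be sketched as σ_t² = g_t·h_t, where g_t is a deterministic long-run variance level and h_t a unit-mean GARCH(1,1) short-run component. The logistic shape of g_t and all parameter values below are illustrative assumptions, not taken from the survey.

```python
import math
import random

random.seed(2)
T = 2000
omega, alpha, beta = 0.05, 0.10, 0.85  # short-run GARCH(1,1); unconditional level 1
h, eps_prev = 1.0, 0.0
returns = []
for t in range(T):
    # Deterministic long-run component: smooth logistic shift from about 1 to 3
    g = 1.0 + 2.0 / (1.0 + math.exp(-10.0 * (t / T - 0.5)))
    # Short-run component driven by returns standardized by the long-run level
    h = omega + alpha * eps_prev ** 2 + beta * h
    sigma2 = g * h                       # multiplicative decomposition
    r = math.sqrt(sigma2) * random.gauss(0.0, 1.0)
    eps_prev = r / math.sqrt(g)          # keep the GARCH part on a unit scale
    returns.append(r)

early = sum(r * r for r in returns[:500]) / 500
late = sum(r * r for r in returns[-500:]) / 500
print(round(late / early, 1))  # late-sample variance is markedly higher
```

Standardizing the returns by √g_t before feeding them back into the GARCH recursion is what keeps the two components separately interpretable: g_t carries the slow level shift and h_t the volatility clustering.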
By:  Peter Hull 
Abstract:  Researchers increasingly leverage movement across multiple treatments to estimate causal effects. While these "mover regressions" are often motivated by a linear constant-effects model, it is not clear what they capture under weaker quasi-experimental assumptions. I show that binary-treatment mover regressions recover a convex average of four difference-in-difference comparisons and are thus causally interpretable under a standard parallel trends assumption. Estimates from multiple-treatment models, however, need not be causal without stronger restrictions on the heterogeneity of treatment effects and time-varying shocks. I propose a class of two-step estimators to isolate and combine the large set of difference-in-difference quasi-experiments generated by a mover design, identifying mover average treatment effects under conditional-on-covariates parallel trends and effect homogeneity restrictions. I characterize the efficient estimators in this class and derive specification tests based on the model's overidentifying restrictions. Future drafts will apply the theory to the Finkelstein et al. (2016) movers design, analyzing the causal effects of geography on healthcare utilization. 
Date:  2018–04 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1804.06721&r=ecm 
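The elementary building block referenced in the abstract, a single two-period difference-in-difference comparison of movers against stayers, can be sketched with hypothetical numbers:

```python
# Toy two-period difference-in-difference: movers (untreated -> treated)
# versus never-treated stayers. All outcome values are hypothetical.
movers_pre, movers_post = [2.0, 3.0, 2.5, 2.5], [5.0, 6.0, 5.5, 5.5]
stayers_pre, stayers_post = [1.0, 2.0, 1.5, 1.5], [2.0, 3.0, 2.5, 2.5]

def mean(v):
    return sum(v) / len(v)

# The change for movers minus the change for stayers removes common time
# shocks under a parallel trends assumption.
did = (mean(movers_post) - mean(movers_pre)) - (mean(stayers_post) - mean(stayers_pre))
print(did)  # movers gain 3.0, stayers gain 1.0, so the DiD estimate is 2.0
```

A mover design generates many such comparisons (who moves, in which direction, and when), and the paper's point is that the standard regression aggregates them, with causal interpretability depending on how that aggregation weights heterogeneous effects.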
By:  Stéphane Bonhomme (University of Chicago); Koen Jochmans (Département d'économie); JeanMarc Robin (Département d'économie) 
Abstract:  We propose a two-step method to nonparametrically estimate multivariate models in which the observed outcomes are independent conditional on a discrete latent variable. Applications include microeconometric models with unobserved types of agents, regime-switching models, and models with misclassification error. In the first step, we estimate weights that transform moments of the marginal distribution of the data into moments of the conditional distribution of the data for given values of the latent variable. In the second step, these conditional moments are estimated as weighted sample averages. We illustrate the method by estimating a model of wages with unobserved heterogeneity on PSID data. 
Keywords:  Latent variable models; Unobserved heterogeneity; Finite mixtures; Hidden Markov models; Nonparametric estimation; Panel data; Wage dynamics 
JEL:  C14 C33 C38 J31 
Date:  2017–12 
URL:  http://d.repec.org/n?u=RePEc:spo:wpmain:info:hdl:2441/lpag9391598uoauqu4u9opq76&r=ecm 
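The second step described in the abstract, conditional moments as weighted sample averages, can be illustrated with a two-component Gaussian mixture in which the posterior weights are taken as known. This oracle stand-in replaces the paper's first step, which estimates such weights from the data; the mixture parameters below are illustrative.

```python
import math
import random

random.seed(3)
# Two-component Gaussian mixture: latent type k in {0, 1} with means mu0, mu1.
pi1, mu0, mu1, n = 0.5, 0.0, 4.0, 20000
ys = []
for _ in range(n):
    k = 1 if random.random() < pi1 else 0
    ys.append(random.gauss(mu1 if k else mu0, 1.0))

def phi(y, mu):
    """Standard-deviation-one Gaussian density."""
    return math.exp(-0.5 * (y - mu) ** 2) / math.sqrt(2 * math.pi)

def w1(y):
    """Oracle posterior weight of type 1 given y (estimated in the paper's first step)."""
    a, b = (1 - pi1) * phi(y, mu0), pi1 * phi(y, mu1)
    return b / (a + b)

# Second step: the conditional moment E[Y | k = 1] as a weighted sample average.
num = sum(w1(y) * y for y in ys)
den = sum(w1(y) for y in ys)
print(round(num / den, 2))  # close to mu1 = 4.0
```

The ratio works because the weights integrate out the latent type: in population, E[w1(Y)·Y] / E[w1(Y)] equals E[Y | k = 1] exactly, so replacing expectations by sample averages gives a consistent estimate.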