
New Economics Papers on Econometrics 
By:  Antoch, Jaromir; Hanousek, Jan; Horvath, Lajos; Huskova, Marie; Wang, Shixuan 
Abstract:  The detection of (structural) breaks, the so-called change point problem, has drawn increasing attention in both theoretical and applied economic and financial research over the last decade. A large part of the existing research concentrates on the detection and asymptotic properties of change point procedures for panels with a large time dimension T. In this article we study a different approach, i.e., we consider the asymptotic properties with respect to N (the number of panel members) while keeping T fixed. This situation (N → ∞ with T fixed and rather small) is typical of large (firm-level) data sets containing financial information about an immense number of firms/stocks across a limited number of years/quarters/months. We propose a general approach for testing for break(s) in this setup, which also allows their detection. In particular, we derive the asymptotic behavior of the test statistics, along with an alternative wild bootstrap procedure that can be used to generate the critical values of the test statistics. The theoretical approach is supplemented by numerous simulations and extended by an empirical illustration. In the practical application we demonstrate the testing procedure in the framework of the four-factor CAPM model. In particular, we estimate breaks in monthly returns of US mutual funds during the period January 2006 to February 2010, which covers the subprime crisis. 
Keywords:  Change point problem; stationarity; panel data; bootstrap; four-factor CAPM model; US mutual funds. 
JEL:  C10 C23 C33 
Date:  2017–03 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:11891&r=ecm 
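The fixed-T, large-N break test described above can be illustrated with a minimal sketch. This is not the authors' exact statistic: the CUSUM-type form, the scaling, and the Rademacher wild-bootstrap weights are illustrative choices.

```python
import numpy as np

def panel_cusum_stat(Y):
    """Simplified CUSUM-type break statistic for an N x T panel.

    For each candidate break date t, compare pre/post means averaged
    over the N panel members (large-N, fixed-T asymptotics)."""
    N, T = Y.shape
    stats = []
    for t in range(1, T):
        diff = Y[:, :t].mean(axis=1) - Y[:, t:].mean(axis=1)
        w = t * (T - t) / T   # scale so statistics are comparable across t
        stats.append(w * np.mean(diff) ** 2 * N)
    return max(stats)

def wild_bootstrap_pvalue(Y, B=499, seed=0):
    """Wild-bootstrap p-value: multiply each panel member's demeaned
    series by an independent Rademacher sign and recompute the statistic."""
    rng = np.random.default_rng(seed)
    N, T = Y.shape
    obs = panel_cusum_stat(Y)
    Yc = Y - Y.mean(axis=1, keepdims=True)
    count = 0
    for _ in range(B):
        signs = rng.choice([-1.0, 1.0], size=(N, 1))
        if panel_cusum_stat(Yc * signs) >= obs:
            count += 1
    return (count + 1) / (B + 1)
```

Under a common mid-sample break the observed statistic dwarfs the bootstrap draws, so the p-value sits near its lower bound 1/(B+1).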
By:  Bartolucci, Francesco; Pigini, Claudia 
Abstract:  Strict exogeneity of covariates other than the lagged dependent variable, conditional on unobserved heterogeneity, is often required for consistent estimation of binary panel data models. This assumption is likely to be violated in practice because of feedback effects from the past of the outcome variable on the present value of covariates, and no general solution is yet available. In this paper, we provide the conditions for a logit model formulation that takes feedback effects into account without specifying a joint parametric model for the outcome and predetermined explanatory variables. Our formulation is based on the equivalence between Granger's definition of noncausality and a modification of Sims' strict exogeneity assumption for nonlinear panel data models, introduced by Chamberlain (1982) and for which we provide a more general theorem. We further propose estimating the model parameters with a recent fixed-effects approach based on pseudo-conditional inference, adapted to the present case, thereby also taking care of the correlation between individual permanent unobserved heterogeneity and the model's covariates. Our results hold for short panels with a large number of cross-section units, a case of great interest in microeconomic applications. 
Keywords:  fixed effects, noncausality, predetermined covariates, pseudo-conditional inference, strict exogeneity 
JEL:  C12 C23 C25 
Date:  2017–03–13 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:77486&r=ecm 
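The pseudo-conditional approach above builds on Chamberlain-style conditional inference. A minimal sketch of the classic building block — the fixed-effects conditional logit for the simplest case (T = 2, strictly exogenous covariates, no feedback) — is given below; this is not the authors' estimator, and the function name is hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def conditional_logit_T2(y, X):
    """Chamberlain-style conditional logit for a two-period binary panel.

    y: (N, 2) outcomes; X: (N, 2, K) covariates. Conditioning on
    y_i1 + y_i2 = 1 removes the fixed effect; only 'movers' contribute."""
    movers = y.sum(axis=1) == 1
    d = y[movers, 1]                        # 1 if the switch was 0 -> 1
    dx = X[movers, 1, :] - X[movers, 0, :]  # within-change in covariates

    def negloglik(beta):
        z = dx @ beta
        # log P(y2 = 1 | y1 + y2 = 1) = z - log(1 + exp(z))
        return -np.sum(d * z - np.logaddexp(0.0, z))

    res = minimize(negloglik, np.zeros(X.shape[2]), method="BFGS")
    return res.x
```

The conditioning step is what eliminates the permanent unobserved heterogeneity without specifying its distribution.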
By:  José E. Figueroa-López (Department of Mathematics, Washington University in St. Louis, MO, 63130, USA); Cecilia Mancini (Department of Management and Economics, University of Florence, via delle Pandette 9, 50127, Italy) 
Abstract:  We consider a univariate semimartingale model for (the logarithm of) an asset price, containing jumps of possibly infinite activity (IA). The nonparametric threshold estimator $\hat{IV}_n$ of the integrated variance $IV:=\int_0^T\sigma_s^2\,ds$ proposed in [6] is constructed from observations on a discrete time grid: it sums the squared increments of the process that lie under a threshold, a deterministic function of the observation step and possibly of the coefficients of X. All threshold functions satisfying given conditions yield asymptotically consistent estimates of IV; however, the finite sample properties of $\hat{IV}_n$ can depend on the specific choice of the threshold. We aim here at optimally selecting the threshold by minimizing either the estimation mean square error (MSE) or the conditional mean square error (cMSE). The latter criterion yields a threshold which is optimal not in mean but for the specific path at hand. A parsimonious characterization of the optimum is established, which turns out to be asymptotically proportional to the Lévy modulus of continuity of the underlying Brownian motion. Moreover, minimizing the cMSE enables us to propose a novel implementation scheme for the optimal threshold sequence. Monte Carlo simulations illustrate the superior performance of the proposed method. 
Keywords:  threshold estimator, integrated variance, Lévy jumps, mean square error, conditional mean square error, modulus of continuity of the Brownian motion paths, numerical scheme 
JEL:  C6 C13 
Date:  2017–03 
URL:  http://d.repec.org/n?u=RePEc:flo:wpaper:201701&r=ecm 
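The threshold estimator itself is short to state in code. Below is a sketch with an illustrative power-law threshold r(h) = c·h^{2ω}, ω < 1/2; the constants c and ω are arbitrary choices for demonstration, not the optimal MSE/cMSE threshold the paper derives.

```python
import numpy as np

def threshold_iv(x, dt, c=9.0, omega=0.49):
    """Truncated realized variance (Mancini-type threshold estimator).

    x: log-price observations on a grid with step dt.
    Squared increments above the threshold r(dt) = c * dt**(2*omega)
    are attributed to jumps and discarded; the rest are summed."""
    dx = np.diff(x)
    thresh = c * dt ** (2 * omega)
    return np.sum(dx[dx ** 2 <= thresh] ** 2)
```

On a simulated Brownian path with a few large jumps, the truncation removes the jump contribution that plain realized variance would include.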
By:  Hacène Djellout (LMBP, Laboratoire de Mathématiques Blaise Pascal, Université Blaise Pascal Clermont-Ferrand 2, CNRS, Centre National de la Recherche Scientifique); Hui Jiang (Department of Mathematics, Nanjing University of Aeronautics and Astronautics) 
Abstract:  Considerable interest has recently been paid to the estimation of realized volatility and covolatility from high-frequency data on financial price processes in financial econometrics. Threshold estimation is one of the useful techniques for inference on jump-type stochastic processes from discrete observations. In this paper, we adopt the threshold estimator introduced by Mancini, in which only the variations under a given threshold function are taken into account. The purpose of this work is to investigate large and moderate deviations for the threshold estimator of the integrated variance-covariance vector. This paper extends the previous work of Djellout, Guillin and Samoura, where the problem was studied in the absence of a jump component. We use an approximation lemma to prove the large deviation principle (LDP). As the reader may expect, we obtain the same results as in the case without jumps. 
Keywords:  Jump Poisson, Large deviation principle, Quadratic variation, Threshold estimator 
Date:  2017–03–19 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01147189&r=ecm 
By:  Huber, Martin; Steinmayr, Andreas 
Abstract:  This paper suggests a causal framework for disentangling individual-level treatment effects and interference effects, i.e., general equilibrium, spillover, or interaction effects related to the distribution of treatment. The framework thus allows for a relaxation of the Stable Unit Treatment Value Assumption (SUTVA), which assumes away any form of treatment-dependent interference between study participants. Instead, we permit interference effects within aggregate units, for example regions or local labor markets, but need to rule out interference effects between these aggregate units. Borrowing notation from the causal mediation literature, we define a range of policy-relevant effects and formally discuss identification based on randomization, selection on observables, and difference-in-differences. We also present an application to a policy intervention extending unemployment benefit durations in selected regions of Austria that arguably affected ineligibles in treated regions through general equilibrium effects in local labor markets. 
Keywords:  treatment effect; general equilibrium effects; spillover effects; interaction effects; interference effects; inverse probability weighting; propensity score; mediation analysis; difference-in-differences 
JEL:  C21 C31 
Date:  2017–03–23 
URL:  http://d.repec.org/n?u=RePEc:fri:fribow:fribow00481&r=ecm 
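Identification under selection on observables in this setting leads to inverse-probability weighting over cells defined by own treatment and the aggregate (e.g., regional) treatment intensity. A stylized sketch, with a binary regional-share indicator, known cell probabilities, and hypothetical function names (the paper's actual effects and estimators are richer):

```python
import numpy as np

def ipw_cell_mean(y, in_cell, p_cell):
    """Hajek-style IPW mean of the potential outcome for one
    (own-treatment, regional-share) cell; p_cell is each unit's
    probability of landing in that cell given covariates."""
    w = in_cell / p_cell
    return np.sum(w * y) / np.sum(w)
```

A total effect of moving from (untreated, low regional share) to (treated, high share) is then a difference of two such reweighted cell means.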
By:  Scott French (School of Economics, UNSW Business School, UNSW) 
Abstract:  Gravity estimation based on sector-level trade data is generally misspecified because it ignores the role of product-level comparative advantage in shaping the effects of trade barriers on sector-level trade flows. Using a model that allows for arbitrary patterns of product-level comparative advantage, I show that sector-level trade flows follow a generalized gravity equation that contains an unobservable, bilateral component that is correlated with trade costs and omitted by standard sector-level gravity models. I propose and implement an estimator that uses product-level data to account for patterns of comparative advantage and find the bias in sector-level estimates to be significant. I also find that, when controlling for product-level comparative advantage, estimates are much more robust to distributional assumptions, suggesting that remaining biases due to heteroskedasticity and sample selection are less severe than previously thought. 
Keywords:  international trade, product-level, misspecification, heteroskedasticity, multi-sector 
JEL:  F10 F14 C13 C21 C50 
Date:  2017–01 
URL:  http://d.repec.org/n?u=RePEc:swe:wpaper:201703&r=ecm 
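Gravity equations of this kind are commonly estimated by Poisson pseudo-maximum likelihood (PPML). A compact Newton-iteration sketch of plain PPML follows; it is background for the abstract's discussion of distributional assumptions, not the author's product-level estimator.

```python
import numpy as np

def ppml(X, y, iters=25):
    """Poisson pseudo-maximum likelihood: solve the moment condition
    sum_i (y_i - exp(x_i'b)) x_i = 0 by Newton steps.
    A crude log-linear fit on log(1 + y) provides the starting value."""
    beta = np.linalg.solve(X.T @ X, X.T @ np.log1p(y))
    for _ in range(iters):
        mu = np.exp(X @ beta)
        step = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
        beta = beta + step
        if np.max(np.abs(step)) < 1e-10:
            break
    return beta
```

PPML is consistent under correct specification of the conditional mean even when the data are not Poisson, which is why it is the workhorse in this literature.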
By:  Egger, Peter; Tarlea, Filip 
Abstract:  Trade and trade policy, such as membership in preferential economic integration agreements (PEIAs; e.g., customs unions or free-trade areas), are jointly determined by the same factors. Therefore, work on the causal effects of trade policy on trade relies on selection on observables, with propensity-score matching being the leading example. Conditional on some compact metric (the score) of observable joint determinants of PEIAs and trade flows, the causal average partial effect of PEIAs on trade is obtained from a simple (weighted) mean comparison of trade flows between members and non-members. A key prerequisite for this approach to yield consistent estimates is that the score be balanced: similarity of country pairs in the score (the propensity of PEIA membership) must mean similarity in each and every one of the observables behind it. Otherwise the effect estimates may be biased, and one would misattribute nonparametric effects of differences in individual observables to PEIA membership. We demonstrate that there is a severe upward bias of estimated PEIA effects on trade flows from lack of covariate balancing in real-world data, remedy this bias through entropy balancing, and quantify the bias for partial as well as general-equilibrium effects. 
Keywords:  Balancing property; Causal effects; Entropy balancing; Gravity models; Preferential economic integration agreements; Propensity score estimation; Weighting regression 
JEL:  F13 F14 F15 
Date:  2017–03 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:11894&r=ecm 
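Entropy balancing, the remedy used above, is compact to sketch: control-unit weights of exponential-tilting form are chosen so the reweighted covariate means hit the treated-group means exactly. A minimal dual formulation (function name and setup hypothetical):

```python
import numpy as np
from scipy.optimize import minimize

def entropy_balance(X0, target):
    """Entropy balancing: reweight control units so their covariate
    means exactly match `target` (e.g., the treated-group means).
    Weights take the form w_i proportional to exp(x_i'lam), where lam
    minimizes the convex dual log-sum-exp objective."""
    Xc = X0 - target  # center covariates at the target moments

    def dual(lam):
        return np.log(np.sum(np.exp(Xc @ lam)))

    res = minimize(dual, np.zeros(X0.shape[1]), method="BFGS")
    z = np.exp(Xc @ res.x)
    return z / z.sum()
```

At the optimum the gradient of the dual is the weighted mean of the centered covariates, which is exactly the balance condition the abstract emphasizes.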
By:  Belén García Cárceles; Bernardí Cabrer Borrás; Jose Manuel Pavía Miralles 
Abstract:  Time series modeling based on automatic signal extraction methods has been widely studied and used in different contexts of economic analysis. Methodological innovations in ARIMA/SARIMA model estimation made significant contributions to the understanding of the temporal dynamics of events, even when the time structure was apparently irregular and unpredictable. The popularity of these models is reflected in applications implementing algorithms that automatically extract the temporal patterns of a series and provide a reasonably accurate adjustment by a mathematical model, in a quick and consistent manner. One of the most common uses of these programs is in the univariate context, to filter a series for posterior use in a multivariate structure. However, there is significant untapped potential in the results provided by those applications. This paper describes a methodology in which TRAMO-SEATS and X-13-ARIMA are used directly in a multivariate structure. Specifically, we apply data analysis techniques related to artificial neural networks. Under the neural network philosophy, events are conceived as linked nodes which activate or not depending on the intensity of an input signal; it is at that point that TRAMO-SEATS or X-13 comes into play. To illustrate the methodology and the use of the model, health-related time series are used, and a consistent model able to "react" to the dynamic interrelations of the variables considered is described. Standard panel data modeling is included in the example and compared with the new methodology. 
Keywords:  Spain, Germany, Netherlands, Sweden, Belgium, Modeling: new developments, Forecasting and projection methods 
Date:  2015–07–01 
URL:  http://d.repec.org/n?u=RePEc:ekd:008007:8669&r=ecm 
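The signal-extraction step that TRAMO-SEATS / X-13 automates can be illustrated by its simplest ancestor, a classical additive decomposition into trend, seasonal, and remainder. This is a stand-in for exposition, not either program's actual algorithm.

```python
import numpy as np

def classical_decompose(y, period):
    """Additive decomposition y = trend + seasonal + remainder.

    Trend: centered moving average (half-weights at the ends for even
    periods). Seasonal: per-position means of the detrended series,
    normalized to sum to zero over one period."""
    n = len(y)
    if period % 2 == 0:
        w = np.r_[0.5, np.ones(period - 1), 0.5] / period
    else:
        w = np.ones(period) / period
    k = len(w) // 2
    trend = np.full(n, np.nan)          # undefined at the edges
    trend[k:n - k] = np.convolve(y, w, mode="valid")
    detrended = y - trend
    means = np.array([np.nanmean(detrended[i::period]) for i in range(period)])
    means -= means.mean()
    seasonal = np.tile(means, n // period + 1)[:n]
    return trend, seasonal, y - trend - seasonal
```

The filtered (seasonally adjusted) series is what would then feed the multivariate neural-network stage described in the abstract.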
By:  Dirk Hoorelbeke 
Abstract:  This paper proposes a bootstrap method to enhance the performance of the information matrix test and, more generally, of the score test. The information matrix test is a model specification test proposed by White (1982). The standard bootstrap method is to use the bootstrap distribution of the test statistic to obtain a critical value which is more accurate than the asymptotic critical value. However, the score test uses a quadratic form statistic. In the construction and implementation of such a quadratic form statistic, two important aspects determine the performance of the test (both under the null and under the alternative): (i) the weighting matrix (the estimate of the variance matrix) and (ii) the critical value. In this paper the bootstrap is used to obtain simultaneously a better variance matrix estimate and accurate critical values. The information matrix test is studied in some Monte Carlo experiments. 
Keywords:  N/A, Other issues, Forecasting and projection methods 
Date:  2015–07–01 
URL:  http://d.repec.org/n?u=RePEc:ekd:008007:8265&r=ecm 
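The double role of the bootstrap described above — the variance matrix and the critical value both coming from resamples — can be sketched for a generic quadratic-form score statistic. This is a simplified nonparametric version, not the paper's exact procedure.

```python
import numpy as np

def score_stat(scores, V):
    """Quadratic-form score statistic n * s_bar' V^{-1} s_bar from
    per-observation score contributions (n x q) and weighting matrix V."""
    n = scores.shape[0]
    s = scores.mean(axis=0)
    return n * s @ np.linalg.solve(V, s)

def bootstrap_score_test(scores, B=499, seed=0):
    """(i) Estimate the variance of sqrt(n)*s_bar from bootstrap
    resamples; (ii) take the critical value from recentered resamples."""
    rng = np.random.default_rng(seed)
    n = scores.shape[0]
    boot_means = np.array([scores[rng.integers(0, n, n)].mean(axis=0)
                           for _ in range(B)])
    V = n * np.cov(boot_means, rowvar=False)
    obs = score_stat(scores, V)
    centered = scores - scores.mean(axis=0)   # impose the null
    boot_stats = [score_stat(centered[rng.integers(0, n, n)], V)
                  for _ in range(B)]
    pval = (1 + sum(bs >= obs for bs in boot_stats)) / (B + 1)
    return obs, pval
```

Recentering the scores before resampling is what makes the bootstrap distribution a null distribution.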
By:  Areski Cousin (SAF, Laboratoire de Sciences Actuarielle et Financière, UCBL, Université Claude Bernard Lyon 1); Hassan Maatouk (GdR MASCOT-NUM, Méthodes d'Analyse Stochastique des Codes et Traitements Numériques, CNRS; LIMOS, Laboratoire d'Informatique, de Modélisation et d'Optimisation des Systèmes, Université Blaise Pascal Clermont-Ferrand 2, Université d'Auvergne Clermont-Ferrand I, Sigma Clermont, CNRS; DEMO-ENSMSE, Département Décision en Entreprise : Modélisation, Optimisation, Mines Saint-Étienne, École des Mines de Saint-Étienne, Institut Mines-Télécom [Paris], Institut Henri Fayol); Didier Rullière (SAF, Laboratoire de Sciences Actuarielle et Financière, UCBL, Université Claude Bernard Lyon 1) 
Abstract:  Due to the lack of reliable market information, building financial term structures may be associated with a significant degree of uncertainty. In this paper, we propose a new term-structure interpolation method that extends classical spline techniques by additionally allowing for the quantification of uncertainty. The proposed method is based on a generalization of kriging models with linear equality constraints (market-fit conditions) and shape-preserving conditions such as monotonicity or positivity (no-arbitrage conditions). We define the most likely curve and show how to build confidence bands. The Gaussian process covariance hyper-parameters under the construction constraints are estimated using cross-validation techniques. Based on observed market quotes at different dates, we demonstrate the efficiency of the method by building curves together with confidence intervals for term structures of OIS discount rates, of zero-coupon swap rates, and of CDS implied default probabilities. We also show how to construct interest-rate surfaces or default probability surfaces by considering time (quotation dates) as an additional dimension. 
Keywords:  OIS discount curve, model risk, yield curve, no-arbitrage constraints, implied default distribution, kriging, interest-rate curve 
Date:  2016–06–04 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01206388&r=ecm 
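The kriging idea is easy to sketch in its unconstrained form: Gaussian-process interpolation of observed curve points with pointwise confidence bands. The equality and shape constraints that make the paper's method arbitrage-free are beyond this sketch, and the squared-exponential kernel and its hyper-parameters below are arbitrary illustrative choices.

```python
import numpy as np

def kriging_curve(t_obs, y_obs, t_new, ell=2.0, sig2=0.02 ** 2, noise=1e-8):
    """Unconstrained kriging of a term-structure curve.

    Returns the posterior mean at t_new together with pointwise
    95% confidence bands; `noise` is a small jitter for stability."""
    def k(a, b):
        return sig2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

    K = k(t_obs, t_obs) + noise * np.eye(len(t_obs))
    Ks = k(t_new, t_obs)
    mean = Ks @ np.linalg.solve(K, y_obs)
    var = sig2 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    band = 1.96 * np.sqrt(np.maximum(var, 0.0))
    return mean, mean - band, mean + band
```

As expected of an interpolating kriging model, the band collapses at quoted maturities and widens between them, which is the "most likely curve plus confidence band" picture the abstract describes.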