nep-ecm New Economics Papers
on Econometrics
Issue of 2016‒03‒29
seventeen papers chosen by
Sune Karlsson
Örebro universitet

  1. ESTIMATION OF STAR-GARCH MODELS WITH ITERATIVELY WEIGHTED LEAST SQUARES By Murat Midilic
  2. Estimation of Nonlinear Panel Models with Multiple Unobserved Effects By Chen, Mingli
  3. Unbiased estimation of risk By Marcin Pitera; Thorsten Schmidt
  4. Moment Inequalities for Multinomial Choice with Fixed Effects By Ariel Pakes; Jack Porter
  5. Order Invariant Evaluation of Multivariate Density Forecasts By Dovern, Jonas; Manner, Hans
  6. A Method for Measuring Treatment Effects on the Treated without Randomization By P. A. V. B. Swamy; S. G. Hall; G. S. Tavlas; I. Chang; H. D. Gibson; W. H. Greene; J. S. Mehta
  7. Estimating Quantile Families of Loss Distributions for Non-Life Insurance Modelling via L-moments By Gareth W. Peters; Wilson Y. Chen; Richard H. Gerlach
  8. State Correlation and Forecasting: A Bayesian Approach Using Unobserved Components Models By Luis Uzeda
  9. Property Price Index Theory and Estimation: A Survey By Shimizu, Chihiro; Karato, Koji
  10. Penalized functional spatial regression By Aguilera, Ana M.; Durbán, María; Aguilera-Morillo, M. Carmen
  11. Root-N consistent estimations of time dummies for the dynamic fixed effects logit models: Monte Carlo illustrations By Yoshitsugu Kitazawa
  12. Changing dynamics at the zero lower bound By Gregor Bäurle; Daniel Kaufmann; Sylvia Kaufmann; Rodney W. Strachan
  13. Using Lagged Outcomes to Evaluate Bias in Value-Added Models By Raj Chetty; John N. Friedman; Jonah Rockoff
  14. Clustering Financial Time Series: How Long is Enough? By Gautier Marti; Sébastien Andler; Frank Nielsen; Philippe Donnat
  15. Dynamic Adaptive Mixture Models By Leopoldo Catania
  16. Understanding peer effects: on the nature, estimation and channels of peer effects By Feld J.F.; Zölitz U.N.
  17. Macroeconomic Regimes and Regime Shifts By James D. Hamilton

  1. By: Murat Midilic
    Abstract: This study applies the Iteratively Weighted Least Squares (IWLS) algorithm to a Smooth Transition Autoregressive (STAR) model with conditional variance. Monte Carlo simulations are performed to measure the performance of the algorithm, to compare it with established methods in the literature, and to assess the effect of the initial-value selection method. Simulation results show that the IWLS algorithm yields low bias and mean squared error for the slope parameter estimator when the true value of the slope parameter is low. In an empirical illustration, a STAR-GARCH model is used to forecast daily US Dollar/Australian Dollar and FTSE Small Cap index returns. One-day-ahead out-of-sample forecast results show that the forecast performance of the STAR-GARCH model improves with the IWLS algorithm and that the model performs better than the benchmark model.
    Keywords: STAR, GARCH, iteratively weighted least squares, Australian Dollar, FTSE
    JEL: C15 C51 C53 C58 C87 F31
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:16/918&r=ecm
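    To fix ideas, a minimal IWLS sketch for a regression with GARCH(1,1) errors follows (a hypothetical simplification of the STAR-GARCH setting above, with the variance parameters held fixed rather than estimated): each pass re-fits the mean parameters by weighted least squares, using inverse conditional variances as weights.

      import numpy as np

      def iwls_garch(y, X, omega=0.1, alpha=0.05, beta=0.9, n_iter=20):
          # GARCH parameters are fixed here for illustration, not estimated
          beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS start
          for _ in range(n_iter):
              e = y - X @ beta_hat                          # residuals at current fit
              h = np.empty_like(e)                          # conditional variances
              h[0] = e.var()
              for t in range(1, len(e)):
                  h[t] = omega + alpha * e[t - 1] ** 2 + beta * h[t - 1]
              W = 1.0 / h                                   # inverse-variance weights
              Xw = X * W[:, None]
              beta_hat = np.linalg.solve(X.T @ Xw, Xw.T @ y)  # WLS update
          return beta_hat, h

      # Illustrative AR(1)-type regression on a simulated series
      rng = np.random.default_rng(0)
      y = rng.standard_normal(500).cumsum() * 0.01 + rng.standard_normal(500)
      X = np.column_stack([np.ones(499), y[:-1]])
      print(iwls_garch(y[1:], X)[0])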
  2. By: Chen, Mingli (Department of Economics, University of Warwick)
    Abstract: I propose a fixed effects expectation-maximization (EM) estimator that can be applied to a class of nonlinear panel data models with unobserved heterogeneity, which is modeled as individual effects and/or time effects. Of particular interest is the case of interactive effects, i.e. when the unobserved heterogeneity is modeled as a factor analytical structure. The estimator is obtained through a computationally simple, iterative two-step procedure, where the two steps have closed form solutions. I show that the estimator is consistent in large panels and derive the asymptotic distribution for the case of the probit with interactive effects. I develop analytical bias corrections to deal with the incidental parameter problem. Monte Carlo experiments demonstrate that the proposed estimator has good finite-sample properties.
    Keywords: Nonlinear panel, latent variables, interactive effects, factor error structure, EM algorithm, incidental parameters, bias correction
    JEL: C13 C21 C22
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:wrk:warwec:1120&r=ecm
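    For a flavour of the iterative two-step idea, the following sketch alternates an OLS step and a rank-one principal-components step in a linear one-factor panel (not the nonlinear probit case the paper treats, and without the paper's bias corrections).

      import numpy as np

      def interactive_effects(Y, X, n_iter=50):
          # Y: (N, T) outcomes; X: (N, T) single regressor
          beta = 0.0
          for _ in range(n_iter):
              R = Y - beta * X                       # remove covariate part
              U, s, Vt = np.linalg.svd(R, full_matrices=False)
              F = s[0] * np.outer(U[:, 0], Vt[0])    # rank-1 interactive effect
              Z = Y - F                              # remove factor part
              beta = (X * Z).sum() / (X ** 2).sum()  # OLS update for the slope
          return beta, F

      rng = np.random.default_rng(1)
      N, T = 100, 20
      lam, f = rng.standard_normal(N), rng.standard_normal(T)
      X = rng.standard_normal((N, T)) + np.outer(lam, f)  # regressor correlated with effects
      Y = 1.5 * X + np.outer(lam, f) + rng.standard_normal((N, T))
      print(interactive_effects(Y, X)[0])   # roughly recovers the slope of 1.5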
  3. By: Marcin Pitera; Thorsten Schmidt
    Abstract: The estimation of risk measured in terms of a risk measure is typically done in two steps: in the first step, the distribution is estimated by statistical methods, either parametric or non-parametric. In the second step, the estimated distribution is treated as the true distribution and the targeted risk measure is computed. In the parametric case this is achieved by using the formula for the risk measure in the model and inserting the estimated parameters. It is well known that this procedure is not efficient because the highly nonlinear mapping from model parameters to the risk measure introduces an additional bias. Statistical experiments show that this bias leads to a systematic underestimation of risk. In this regard, we introduce the concept of unbiasedness to the estimation of risk. We show that an appropriate bias correction is available for many well-known estimators. In particular, we consider value-at-risk and tail value-at-risk (expected shortfall). In the special case of normal distributions, closed-form solutions for unbiased estimators are given. For the general case we propose a bootstrapping algorithm and illustrate the outcomes by several data experiments.
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1603.02615&r=ecm
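    The general bootstrapping idea can be sketched for a plug-in Gaussian value-at-risk estimator as follows (a generic bias correction in the spirit of the paper, not necessarily the authors' exact algorithm; the closed-form unbiased estimators for the normal case are given in the paper itself).

      import numpy as np
      from scipy.stats import norm

      def var_plugin(x, alpha=0.95):
          # Plug-in Gaussian VaR: sample mean + sample std * normal quantile
          return x.mean() + x.std(ddof=1) * norm.ppf(alpha)

      def var_bias_corrected(x, alpha=0.95, n_boot=2000, rng=None):
          rng = rng or np.random.default_rng()
          v_hat = var_plugin(x, alpha)
          # Estimate the estimator's bias by resampling from the empirical distribution
          boot = np.array([var_plugin(rng.choice(x, size=len(x)), alpha)
                           for _ in range(n_boot)])
          return v_hat - (boot.mean() - v_hat)   # subtract the estimated bias

      rng = np.random.default_rng(2)
      x = rng.standard_normal(50)   # small sample, where the plug-in bias matters
      print(var_plugin(x), var_bias_corrected(x, rng=rng))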
  4. By: Ariel Pakes; Jack Porter
    Abstract: We propose a new approach to semiparametric analysis of multinomial choice models with fixed effects and a group (or panel) structure. A traditional random utility framework is employed, and the key assumption is a group homogeneity condition on the disturbances. This assumption places no restrictions on either the joint distribution of the disturbances across choices or within group (or across time) correlations. This work follows a substantial nonlinear panel literature (Manski 1987, Honore 1992, Abrevaya 1999, 2000) with the distinction that multiple covariate index functions now determine the outcome. A novel within-group comparison leads to a set of conditional moment inequalities that provide partial identifying information about the parameters of the observed covariate index functions, while avoiding the incidental parameter problem. We extend our framework to allow for certain types of endogenous regressors (including lagged dependent variables and conditional heteroskedasticity), set-valued covariates, and parametric distributional information on disturbances.
    JEL: C14 C23 C25
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:21893&r=ecm
  5. By: Dovern, Jonas; Manner, Hans
    Abstract: We derive new tests for proper calibration of multivariate density forecasts based on Rosenblatt probability integral transforms. These tests have the advantage that they i) do not depend on the ordering of variables in the forecasting model, ii) are applicable to densities of arbitrary dimensions, and iii) have superior power relative to existing approaches. We furthermore develop adjusted tests that allow for estimated parameters and, consequently, can be used as in-sample specification tests. We demonstrate the problems of existing tests and show how our new approaches overcome them, using Monte Carlo simulations as well as two applications based on multivariate GARCH-based models for stock market returns and on a macroeconomic Bayesian vector autoregressive model.
    Keywords: density calibration; goodness-of-fit test; predictive density; Rosenblatt transformation
    Date: 2016–03–08
    URL: http://d.repec.org/n?u=RePEc:awi:wpaper:0608&r=ecm
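    For intuition, a minimal sketch of the Rosenblatt transform for a bivariate Gaussian forecast is given below; under correct calibration the transformed values are i.i.d. uniform on [0, 1]. Swapping the two coordinates changes the transforms, which is precisely the ordering dependence the proposed tests remove.

      import numpy as np
      from scipy.stats import norm, kstest

      def rosenblatt_bivariate_gaussian(y, rho):
          # y: (n, 2) realizations; forecast density is N(0, [[1, rho], [rho, 1]])
          u1 = norm.cdf(y[:, 0])                                       # marginal PIT
          u2 = norm.cdf((y[:, 1] - rho * y[:, 0]) / np.sqrt(1 - rho ** 2))  # conditional PIT
          return u1, u2

      rng = np.random.default_rng(3)
      rho = 0.6
      L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
      y = rng.standard_normal((1000, 2)) @ L.T        # correctly calibrated case
      u1, u2 = rosenblatt_bivariate_gaussian(y, rho)
      print(kstest(u1, "uniform"), kstest(u2, "uniform"))   # neither should reject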
  6. By: P. A. V. B. Swamya; S. G. Hall; G. S. Tavlas; I. Chang; H. D. Gibson; W. H. Greene; J. S. Mehta
    Abstract: This paper contributes to the literature on the estimation of causal effects by providing an analytical formula for individual specific treatment effects and an empirical methodology that allows us to estimate these effects. We derive the formula from a general model with minimal restrictions, unknown functional form and true unobserved variables such that it is a credible model of the underlying real-world relationship. Subsequently, we manipulate the model in order to put it in an estimable form. In contrast to other empirical methodologies, which derive average treatment effects, we derive an analytical formula that provides estimates of treatment effects on each treated individual. We also provide an empirical example that illustrates our methodology.
    Keywords: Causality, Real-world relationship, Unique error term, Treatment effect, Non-experimental situation
    JEL: C13 C51
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:16/02&r=ecm
  7. By: Gareth W. Peters; Wilson Y. Chen; Richard H. Gerlach
    Abstract: This paper discusses different classes of loss models in non-life insurance settings. It then overviews the class of Tukey transform loss models that have not yet been widely considered in non-life insurance modelling, but offer opportunities to produce the flexible skewness and kurtosis features often required in loss modelling. In addition, these loss models admit explicit quantile specifications which make them directly relevant for quantile-based risk measure calculations. We detail various parameterizations and sub-families of the Tukey transform based models, such as the g-and-h, g-and-k and g-and-j models, including their properties of relevance to loss modelling. One of the challenges with such models is to perform robust estimation of the loss model parameters in a manner that is amenable to practitioners fitting such models. In this paper we develop a novel, efficient and robust estimation procedure for the model parameters in this family of Tukey transform models, based on L-moments. It is shown to be more robust and efficient than current state-of-the-art methods of estimation for such families of loss models and is simple to implement for practical purposes.
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1603.01041&r=ecm
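    The mechanics of L-moment matching can be sketched for the g-and-k subfamily as follows (a toy illustration using the common c = 0.8 convention; the function and parameter names are ours, and the paper develops the estimator properly and studies its robustness): compute sample L-moments, compute model L-moments by numerical integration of the quantile function, and minimize the distance between the two.

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize
      from scipy.integrate import quad

      def gk_quantile(z, a, b, g, k, c=0.8):
          # g-and-k quantile function evaluated at z = Phi^{-1}(u)
          return a + b * (1 + c * np.tanh(g * z / 2)) * (1 + z ** 2) ** k * z

      def sample_lmoments(x):
          # First four sample L-moments via probability-weighted moments
          x = np.sort(x); n = len(x); i = np.arange(1, n + 1)
          b0 = x.mean()
          b1 = np.sum((i - 1) * x) / (n * (n - 1))
          b2 = np.sum((i - 1) * (i - 2) * x) / (n * (n - 1) * (n - 2))
          b3 = np.sum((i - 1) * (i - 2) * (i - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
          return np.array([b0, 2*b1 - b0, 6*b2 - 6*b1 + b0, 20*b3 - 30*b2 + 12*b1 - b0])

      def model_lmoments(theta):
          # Model L-moments: integrals of Q against shifted Legendre polynomials
          polys = [lambda u: 1.0, lambda u: 2*u - 1,
                   lambda u: 6*u**2 - 6*u + 1, lambda u: 20*u**3 - 30*u**2 + 12*u - 1]
          return np.array([quad(lambda z, p=p: gk_quantile(z, *theta) * p(norm.cdf(z))
                                * norm.pdf(z), -8, 8)[0] for p in polys])

      rng = np.random.default_rng(4)
      x = gk_quantile(rng.standard_normal(2000), 0.0, 1.0, 0.5, 0.2)  # simulated losses
      lm = sample_lmoments(x)
      fit = minimize(lambda th: np.sum((model_lmoments(th) - lm) ** 2),
                     x0=[0.0, 1.0, 0.0, 0.1], method="Nelder-Mead")
      print(fit.x)   # roughly (0, 1, 0.5, 0.2)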
  8. By: Luis Uzeda
    Abstract: The implications for signal extraction that arise from specifying unobserved components (UC) models with correlated or orthogonal innovations have been well investigated. In contrast, an analogous statement cannot be made for forecast evaluation. This paper attempts to fill this gap in light of the recent resurgence of studies adopting UC models for forecasting purposes. In particular, four correlation structures are entertained: orthogonal, correlated, and perfectly correlated innovations, as well as a novel approach that combines features of two contrasting cases, namely orthogonal and perfectly correlated innovations. Parameter space restrictions associated with different correlation structures and their connection with forecasting are discussed within a Bayesian framework. Introducing perfectly correlated innovations, however, reduces the rank of the covariance matrix. To accommodate this, a Markov Chain Monte Carlo sampler which builds upon properties of Toeplitz matrices and recent advances in precision-based algorithms is developed. Our results for several measures of U.S. inflation indicate that the correlation structure between state variables has important implications for forecasting performance as well as for estimates of trend inflation.
    Keywords: Bayesian, Markov Chain Monte Carlo, State Space, Unobserved Components Models, ARIMA, Reduced Rank, Precision, Forecasting
    JEL: C11 C15 C51 C53
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2016-632&r=ecm
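    For readers unfamiliar with precision-based algorithms, the basic building block is sketched below for a local-level model with orthogonal innovations (a minimal sketch; the paper extends this idea to the reduced-rank, correlated cases using properties of Toeplitz matrices): the full conditional of the trend is Gaussian with a banded precision matrix, so it can be drawn in one shot via a Cholesky factorization.

      import numpy as np
      from scipy.linalg import cholesky, solve_triangular

      def draw_trend(y, sig2, omega2, rng):
          # y_t = tau_t + eps_t, eps ~ N(0, sig2); delta tau_t ~ N(0, omega2)
          n = len(y)
          H = np.eye(n) - np.eye(n, k=-1)           # first-difference matrix
          K = H.T @ H / omega2 + np.eye(n) / sig2   # posterior precision (banded)
          C = cholesky(K, lower=True)
          mu = solve_triangular(C.T, solve_triangular(C, y / sig2, lower=True))
          return mu + solve_triangular(C.T, rng.standard_normal(n))  # draw from N(mu, K^{-1})

      rng = np.random.default_rng(5)
      tau = np.cumsum(0.1 * rng.standard_normal(300))   # random-walk trend
      y = tau + rng.standard_normal(300)
      print(np.corrcoef(draw_trend(y, 1.0, 0.01, rng), tau)[0, 1])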
  9. By: Shimizu, Chihiro; Karato, Koji
    Abstract: Property has the particularity of being a non-homogeneous good; because of this, quality adjustment is necessary when estimating property price indexes. Various methods of quality adjustment have been proposed and applied: the hedonic method often used in price statistics and, because the information available for estimation is limited, the repeat sales price method, methods using property appraisal price information, and so forth. However, given gaps in the underlying theory and restrictions on data, evaluating their practical application remains difficult. This paper therefore outlines the econometric theory underlying the hedonic method, repeat sales price indexes, and indexes employing property appraisal prices, and clarifies the advantages and disadvantages of the respective estimation methods.
    Keywords: Hedonic price index, Repeat sales price index, Age effect, Hybrid method, Property appraisal price method, SPAR
    JEL: C2 C23 C43 D12 E31 R21
    Date: 2016–02
    URL: http://d.repec.org/n?u=RePEc:hit:remfce:34&r=ecm
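    As one concrete example of the estimators surveyed, a minimal Bailey-Muth-Nourse repeat-sales regression on stylized simulated data (no age effects or heteroskedasticity corrections) might look as follows: regress log price ratios on time dummies that are -1 at the first sale and +1 at the second.

      import numpy as np

      def repeat_sales_index(first, second, p1, p2, n_periods):
          # first/second: sale periods (0..n_periods-1); p1/p2: sale prices
          D = np.zeros((len(p1), n_periods))
          D[np.arange(len(p1)), second] = 1.0
          D[np.arange(len(p1)), first] -= 1.0
          D = D[:, 1:]                               # period 0 is the base (index = 1)
          delta = np.linalg.lstsq(D, np.log(p2 / p1), rcond=None)[0]
          return np.exp(np.concatenate([[0.0], delta]))   # index levels by period

      rng = np.random.default_rng(6)
      n, T = 2000, 12
      true = np.cumsum(0.02 + 0.01 * rng.standard_normal(T)); true -= true[0]
      first = rng.integers(0, T - 1, n)
      second = rng.integers(first + 1, T)             # second sale after the first
      p1 = np.exp(true[first] + 0.1 * rng.standard_normal(n))
      p2 = np.exp(true[second] + 0.1 * rng.standard_normal(n))
      print(repeat_sales_index(first, second, p1, p2, T))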
  10. By: Aguilera, Ana M.; Durbán, María; Aguilera-Morillo, M. Carmen
    Abstract: This paper focuses on spatial functional variables whose observations are realizations of a spatio-temporal functional process. In this context, a new smoothing method for functional data presenting spatial dependence is proposed. This approach is based on a P-spline estimation of a functional spatial regression model. As an alternative to other geostatistical smoothing methods (kriging and kernel smoothing, among others), the proposed P-spline approach can be used to estimate the functional form of a set of sample paths observed only at a finite set of time points, and also to predict the corresponding functional variable at a new location within the plane of study. To assess the performance of the proposed method, two simulation studies and an application with real data are developed, and the results are compared with functional kriging.
    Keywords: P-splines; Functional spatial regression; Functional data
    Date: 2015–06–18
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:21206&r=ecm
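    The P-spline building block of the method can be sketched in a few lines for the univariate case (illustrative only; the paper extends it to spatial functional regression): a cubic B-spline basis plus a second-order difference penalty on the coefficients.

      import numpy as np
      from scipy.interpolate import BSpline

      def pspline_fit(t, y, n_basis=20, lam=1.0, degree=3):
          # Build an open knot vector with repeated boundary knots
          inner = np.linspace(t.min(), t.max(), n_basis - degree + 1)
          knots = np.concatenate([[inner[0]] * degree, inner, [inner[-1]] * degree])
          B = BSpline.design_matrix(t, knots, degree).toarray()
          D = np.diff(np.eye(n_basis), n=2, axis=0)        # second-order differences
          theta = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)  # penalized LS
          return B @ theta

      rng = np.random.default_rng(11)
      t = np.linspace(0, 1, 200)
      y = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(200)
      print(np.round(pspline_fit(t, y)[:5], 3))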
  11. By: Yoshitsugu Kitazawa (Faculty of Economics, Kyushu Sangyo University)
    Abstract: This paper illustrates the feasibility of root-N consistent estimation of time dummies for dynamic fixed effects logit models, both with strictly exogenous continuous explanatory variables and with no explanatory variable, using some Monte Carlo experiments. The illustrations not only provide a direct rebuttal to the generalization of Hahn's (2001) suggestion, but also pave the way for easily gauging time effects in dynamic binary choice panel data models.
    Keywords: dynamic fixed effects logit models; time dummies; root-N consistent GMM estimators; Monte Carlo
    JEL: C23 C25
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:kyu:dpaper:72&r=ecm
  12. By: Gregor Bäurle (Swiss National Bank); Daniel Kaufmann (KOF ETH Zurich); Sylvia Kaufmann (Study Center Gerzensee); Rodney W. Strachan (The University of Queensland)
    Abstract: The interaction of macroeconomic variables may change as the nominal short-term interest rate approaches zero. In this paper, we propose an empirical model capturing these changing dynamics with a time-varying parameter vector autoregressive process. State-dependent parameters are determined by a latent state indicator whose probability distribution is itself affected by the lagged interest rate. As the interest rate enters the critical zero lower bound (ZLB) region, the dynamics between variables and the effect of shocks change. We estimate the model with Bayesian methods and explicitly take into account that the interest rate might be constrained in the ZLB region. We provide an estimate of the latent rate, i.e. the rate below the observed level that would be state- and model-consistent. The endogenous specification of the state indicator permits dynamic forecasts of the state and the system variables. In an application to Swiss data, we evaluate state-dependent impulse responses to a risk premium shock identified with sign restrictions. Additionally, we discuss scenario-based forecasts and evaluate the probability of the system exiting the ZLB region based on the inherent dynamics only.
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:szg:worpap:1602&r=ecm
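    A stylized simulation of the key mechanism, with hypothetical parameter values (the paper estimates a full time-varying parameter VAR with Bayesian methods): the regime probability depends on the lagged rate, and the observed rate is the shadow rate censored at zero.

      import numpy as np

      rng = np.random.default_rng(10)
      T, gamma = 500, 4.0
      shadow = np.empty(T); shadow[0] = 2.0
      regime = np.zeros(T, dtype=int)
      for t in range(1, T):
          p_zlb = 1 / (1 + np.exp(gamma * shadow[t - 1]))   # Pr(ZLB regime | lagged rate)
          regime[t] = rng.random() < p_zlb
          persistence = 0.97 if regime[t] else 0.90          # regime-specific dynamics
          shadow[t] = persistence * shadow[t - 1] + 0.3 * rng.standard_normal()
      observed = np.maximum(shadow, 0.0)                     # rate constrained at zero
      print(regime.mean(), (observed == 0).mean())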
  13. By: Raj Chetty; John N. Friedman; Jonah Rockoff
    Abstract: Value-added (VA) models measure the productivity of agents such as teachers or doctors based on the outcomes they produce. The utility of VA models for performance evaluation depends on the extent to which VA estimates are biased by selection, for instance by differences in the abilities of students assigned to teachers. One widely used approach for evaluating bias in VA is to test for balance in lagged values of the outcome, based on the intuition that today’s inputs cannot influence yesterday’s outcomes. We use Monte Carlo simulations to show that, unlike in conventional treatment effect analyses, tests for balance using lagged outcomes do not provide robust information about the degree of bias in value-added models for two reasons. First, the treatment itself (value-added) is estimated, rather than exogenously observed. As a result, correlated shocks to outcomes can induce correlations between current VA estimates and lagged outcomes that are sensitive to model specification. Second, in most VA applications, estimation error does not vanish asymptotically because sample sizes per teacher (or principal, manager, etc.) remain small, making balance tests sensitive to the specification of the error structure even in large datasets. We conclude that bias in VA models is better evaluated using techniques that are less sensitive to model specification, such as randomized experiments, rather than using lagged outcomes.
    JEL: C18 H75 I21 J01 J08 J45 M50 M54
    Date: 2016–02
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:21961&r=ecm
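    The second point, that per-teacher estimation error does not vanish as the number of teachers grows, is easy to see in a toy Monte Carlo (hypothetical parameter values): with class size fixed, the noise in estimated value-added stays at roughly 1/sqrt(class size) no matter how large the dataset.

      import numpy as np

      rng = np.random.default_rng(7)
      class_size = 25
      for n_teachers in [100, 1000, 10000]:
          va = rng.standard_normal(n_teachers) * 0.15          # true teacher effects
          scores = va[:, None] + rng.standard_normal((n_teachers, class_size))
          va_hat = scores.mean(axis=1)                         # class-mean VA estimate
          print(n_teachers, np.std(va_hat - va))               # stays near 1/sqrt(25) = 0.2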
  14. By: Gautier Marti; S\'ebastien Andler; Frank Nielsen; Philippe Donnat
    Abstract: Researchers have used from 30 days to several years of daily returns as source data for clustering financial time series based on their correlations. This paper sets up a statistical framework to study the validity of such practices. We first show that clustering correlated random variables from their observed values is statistically consistent. Then, we also give a first empirical answer to the much debated question: How long should the time series be? If too short, the clusters found can be spurious; if too long, dynamics can be smoothed out.
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1603.04017&r=ecm
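    A small experiment in the spirit of the question, under a hypothetical two-block correlation design rather than the paper's framework: how often does hierarchical clustering on the correlation distance recover the blocks as the sample length T grows?

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      def clean_split(T, rng, n_assets=20, rho=0.3):
          half = n_assets // 2
          # Two equal blocks: correlation rho within a block, zero across blocks
          common = rng.standard_normal((T, 2))
          block = np.repeat([0, 1], half)
          X = np.sqrt(rho) * common[:, block] \
              + np.sqrt(1 - rho) * rng.standard_normal((T, n_assets))
          d = np.sqrt(np.clip(2 * (1 - np.corrcoef(X.T)), 0, None))  # correlation distance
          np.fill_diagonal(d, 0.0)
          labels = fcluster(linkage(squareform(d), "average"), 2, criterion="maxclust")
          return (labels[:half] == labels[0]).all() \
              and (labels[half:] == labels[half]).all() and labels[0] != labels[half]

      rng = np.random.default_rng(8)
      for T in [30, 100, 500]:
          print(T, np.mean([clean_split(T, rng) for _ in range(50)]))  # recovery rate rises with T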
  15. By: Leopoldo Catania
    Abstract: In this paper we propose a new class of Dynamic Adaptive Mixture Models (DAMMs) that can sequentially adapt the mixture components as well as the mixture composition using information coming from the data. The information-driven nature of the proposed class of models allows exact computation of the full likelihood and avoids computationally intensive simulation schemes. An extensive Monte Carlo experiment reveals that the proposed model can accurately approximate the more complicated Stochastic Dynamic Mixture Model previously introduced in the literature, as well as other kinds of models. The properties of the new class of models are discussed throughout the paper, and an application in financial econometrics is reported.
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1603.01308&r=ecm
  16. By: Feld J.F.; Zölitz U.N. (ROA)
    Abstract: This paper estimates peer effects in a university context where students are randomly assigned to sections. While students benefit from better peers on average, low-achieving students are harmed by high-achieving peers. Analyzing students' course evaluations suggests that peer effects are driven by improved group interaction rather than by adjustments in teachers' behavior or students' effort. We further show, building on Angrist (2014), that classical measurement error in a setting where group assignment is systematic can lead to substantial overestimation of peer effects. With random assignment, as is the case in our setting, estimates are only attenuated.
    Keywords: Analysis of Education; Education and Inequality; Human Capital; Skills; Occupational Choice; Labor Productivity
    JEL: I21 I24 J24
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:unm:umaror:2016001&r=ecm
  17. By: James D. Hamilton
    Abstract: Many economic time series exhibit dramatic breaks associated with events such as economic recessions, financial panics, and currency crises. Such changes in regime may arise from tipping points or other nonlinear dynamics and are core to some of the most important questions in macroeconomics. This paper surveys the literature for studying regime changes and summarizes available methods. Section 1 introduces some of the basic tools for analyzing such phenomena, using for illustration the move of an economy into and out of recession. Section 2 focuses on empirical methods, providing a detailed overview of econometric analysis of time series that are subject to changes in regime. Section 3 discusses theoretical treatment of macroeconomic models with changes in regime and reviews applications in a number of areas of macroeconomics. Some brief concluding recommendations for applied researchers are offered in Section 4.
    JEL: C32 E32 E37
    Date: 2016–01
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:21863&r=ecm
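    The basic filtering recursion from this literature, for a two-state Markov-switching mean model with known parameters, can be written in a few lines (a minimal sketch of the Hamilton filter): at each date, mix the state-conditional densities with the predicted state probabilities and update by Bayes' rule.

      import numpy as np
      from scipy.stats import norm

      def hamilton_filter(y, mu, sigma, P):
          # mu: state means, sigma: common std, P[i, j] = Pr(s_t = j | s_{t-1} = i)
          n_states = len(mu)
          xi = np.full(n_states, 1.0 / n_states)        # initial state probabilities
          filtered, loglik = [], 0.0
          for yt in y:
              pred = xi @ P                              # one-step-ahead state probs
              dens = norm.pdf(yt, loc=mu, scale=sigma)   # state-conditional densities
              joint = pred * dens
              loglik += np.log(joint.sum())              # predictive likelihood
              xi = joint / joint.sum()                   # filtered probabilities
              filtered.append(xi)
          return np.array(filtered), loglik

      # Simulate an expansion/recession process and filter it back
      rng = np.random.default_rng(9)
      P = np.array([[0.95, 0.05], [0.10, 0.90]])
      s, states = 0, []
      for _ in range(300):
          s = rng.choice(2, p=P[s]); states.append(s)
      mu = np.array([0.8, -0.5])                         # growth in expansion/recession
      y = mu[np.array(states)] + 0.5 * rng.standard_normal(300)
      probs, ll = hamilton_filter(y, mu, 0.5, P)
      print(ll, np.mean((probs[:, 1] > 0.5) == (np.array(states) == 1)))  # hit rate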

This nep-ecm issue is ©2016 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.