nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒10‒05
sixteen papers chosen by
Sune Karlsson
Orebro University

  1. Bootstrap Tests for Overidentification in Linear Regression Models By Russell Davidson; James G. MacKinnon
  2. Macroeconomic forecasting and structural analysis through regularized reduced-rank regression By Emmanuela Bernardini; Gianluca Cubadda
  3. "Matrix Exponential Stochastic Volatility with Cross Leverage" By Tsunehiro Ishihara; Yasuhiro Omori; Manabu Asai
  4. Measure of Location-Based Estimators in Simple Linear Regression By DANIEL PREVE; Shu-Ping XIJIA LIU
  5. Generalised Linear Spectral Models By Tommaso Proietti; Alessandra Luati
  6. Approximate variational inference for a model of social interactions By Angelo Mele
  7. Identifying Genuine Effects in Observational Research by Means of Meta-Regressions By Stephan B. Bruns
  8. Identification in a Generalization of Bivariate Probit Models with Endogenous Regressors By Sukjin Han; Edward J. Vytlacil
  9. Modelling Biased Judgement with Weighted Updating By Zinn, Jesse
  10. Sharp Bounds on Treatment Effects in a Binary Triangular System By Ismael MOURIFIÉ
  11. Causal Analysis after Haavelmo By Heckman, James J.; Pinto, Rodrigo
  12. Parameter Identification in the Logistic STAR Model By Line Elvstrøm Ekner; Emil Nejstgaard
  13. Estimation of worker and firm effects with censored data By Yolanda F. Rebollo-Sanz; Ainara González de San Román
  14. My Friend Far Far Away: Asymptotic Properties of Pairwise Stable Networks By Vincent BOUCHER; Ismael MOURIFIÉ
  15. Spot-forward Model for Electricity Prices By Stein-Erik, Fleten; Paraschiv, Florentina; Schürle, Michel
  16. The performance of bid-ask spread estimators under less than ideal conditions By Michael Bleaney; Zhiyong Li

  1. By: Russell Davidson (McGill University); James G. MacKinnon (Queen's University)
    Abstract: Despite much recent work on the finite-sample properties of estimators and tests for linear regression models with a single endogenous regressor and weak instruments, little attention has been paid to tests for overidentifying restrictions in these circumstances. We study asymptotic tests for overidentification in models estimated by instrumental variables and by limited-information maximum likelihood. We show that all the test statistics, like the ones used for inference on the regression coefficient, are functions of only six quadratic forms in the two endogenous variables of the model. They are closely related to the well-known test statistic of Anderson and Rubin. The distributions of the overidentification statistics are shown to have an ill-defined limit as the strength of the instruments tends to zero along with a parameter related to the correlation between the disturbances of the two equations of the model. Simulation experiments demonstrate that this makes it impossible to perform reliable inference near the point at which the limit is ill-defined. Several bootstrap procedures are proposed. They alleviate the problem and allow reliable inference when the instruments are not too weak.
    Keywords: Sargan test, Basmann test, Anderson-Rubin test, weak instruments, bootstrap P value
    JEL: C10 C12 C15 C30
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1318&r=ecm
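    As background for entry 1, here is a minimal numpy sketch of the classical Sargan overidentification statistic for a linear IV regression (n times the uncentered R-squared from regressing the 2SLS residuals on the instruments). The bootstrap schemes studied in the paper resample in ways designed for weak instruments and are not reproduced here; the data arrays are placeholders supplied by the user.
```python
import numpy as np
from scipy.stats import chi2

def two_sls(y, X, Z):
    """2SLS estimate of beta in y = X @ beta + u using instruments Z."""
    Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)          # projection onto span(Z)
    return np.linalg.solve(X.T @ Pz @ X, X.T @ Pz @ y)

def sargan_test(y, X, Z):
    """Sargan statistic and its asymptotic chi-squared P value."""
    n, q = len(y), Z.shape[1] - X.shape[1]          # q = overidentifying restrictions
    u = y - X @ two_sls(y, X, Z)
    Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)
    stat = n * (u @ Pz @ u) / (u @ u)               # n * uncentered R^2
    return stat, chi2.sf(stat, df=q)
```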
  2. By: Emmanuela Bernardini (Banca d'Italia); Gianluca Cubadda (University of Rome "Tor Vergata")
    Abstract: This paper proposes a strategy to detect and impose reduced-rank restrictions in medium-sized vector autoregressive models. In this framework, it is known that Canonical Correlation Analysis (CCA) does not perform well because inversions of large covariance matrices are required. We propose a method that combines the richness of reduced-rank regression with the simplicity of naive univariate forecasting methods. In particular, we suggest using a proper shrinkage estimator of the autocovariance matrices involved in the computation of CCA, thus obtaining a method that is asymptotically equivalent to CCA but numerically more stable in finite samples. Simulations and empirical applications document the merits of the proposed approach both in forecasting and in structural analysis.
    Keywords: Reduced rank regression; vector autoregressive models; shrinkage estimation; macroeconomic forecasting.
    JEL: C32
    Date: 2013–10–03
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:289&r=ecm
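    To fix ideas for entry 2, a hedged sketch of canonical correlation analysis between current and lagged observations of a vector time series in which the covariance matrices are shrunk toward their diagonals before inversion. The simple linear shrinkage with a fixed intensity lam is only illustrative; the paper develops a proper shrinkage estimator rather than this ad hoc choice.
```python
import numpy as np
from scipy.linalg import eigh

def shrink(S, lam):
    """Shrink a covariance matrix toward its diagonal."""
    return (1 - lam) * S + lam * np.diag(np.diag(S))

def regularized_cca(Y, lam=0.2):
    """Squared canonical correlations between y_t and y_{t-1} for an (n, k) array Y."""
    Y0, Y1 = Y[1:], Y[:-1]                          # current and lagged observations
    k = Y.shape[1]
    S00 = shrink(np.cov(Y0, rowvar=False), lam)
    S11 = shrink(np.cov(Y1, rowvar=False), lam)
    S01 = np.cov(Y0, Y1, rowvar=False)[:k, k:]      # cross-covariance block
    # Generalized eigenproblem  S01 S11^{-1} S10 v = rho^2 S00 v
    A = S01 @ np.linalg.solve(S11, S01.T)
    return eigh(A, S00, eigvals_only=True)[::-1]    # descending squared correlations
```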
  3. By: Tsunehiro Ishihara (Department of Economics, Hitotsubashi University,); Yasuhiro Omori (Faculty of Economics, University of Tokyo); Manabu Asai (Faculty of Economics, Soka University,)
    Abstract: A multivariate stochastic volatility model with dynamic correlation and cross leverage effects is described, and an efficient estimation method using Markov chain Monte Carlo is proposed. The time-varying covariance matrices are guaranteed to be positive definite by using a matrix exponential transformation. Of particular interest is our approach for sampling a set of latent matrix logarithm variables from their conditional posterior distribution, where we construct the proposal density based on an approximating linear Gaussian state space model. The proposed model and its extended models with fat-tailed error distributions are applied to trivariate returns data (daily stocks, bonds, and exchange rates) of Japan. Further, a model comparison is conducted, including constant correlation multivariate stochastic volatility models with leverage.
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2013cf904&r=ecm
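    A small numerical check related to entry 3 (not the paper's MCMC sampler): the matrix exponential of any symmetric "log-volatility" matrix is automatically a valid covariance matrix, symmetric with strictly positive eigenvalues.
```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
logSigma = (A + A.T) / 2                        # arbitrary symmetric latent matrix
Sigma = expm(logSigma)                          # candidate covariance matrix
print(np.allclose(Sigma, Sigma.T))              # True: symmetry is preserved
print(np.all(np.linalg.eigvalsh(Sigma) > 0))    # True: eigenvalues are exp(.) > 0
```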
  4. By: DANIEL PREVE (City University of Hong Kong); Shu-Ping XIJIA LIU (Uppsala University)
    Abstract: In this note we consider certain measure of location-based estimators (MLBEs) for the slope parameter in a linear regression model with a single stochastic regressor. The median-unbiased MLBEs are interesting as they can be robust to heavy-tailed samples and, hence, preferable to the ordinary least squares estimator (LSE). Two different cases are considered as we investigate the statistical properties of the MLBEs. In the first case, the regressor and error are assumed to follow a symmetric stable distribution. In the second, other types of regressions, with potentially contaminated errors, are considered. For both cases the consistency and exact finite-sample distributions of the MLBEs are established. Some results for the corresponding limiting distributions are also provided. In addition, we illustrate how our results can be extended to include certain heteroskedastic and multiple regressions. Finite-sample properties of the MLBEs in comparison to the LSE are investigated in a simulation study.
    Date: 2013–08
    URL: http://d.repec.org/n?u=RePEc:skb:wpaper:cofie-02-2013&r=ecm
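    One hedged illustration of the idea behind entry 4: replace least squares with a measure of location applied to the ratios y_i/x_i in a no-intercept regression. The estimator below is a generic median-of-ratios slope and need not coincide with the MLBEs defined in the note; the Cauchy errors only mimic a heavy-tailed sample.
```python
import numpy as np

def median_ratio_slope(y, x):
    """Median of y_i / x_i as a robust slope estimate (x_i != 0 assumed)."""
    return np.median(y / x)

# Comparison with no-intercept OLS under heavy-tailed (Cauchy) errors.
rng = np.random.default_rng(1)
x = rng.standard_normal(500) + 3.0
y = 2.0 * x + rng.standard_cauchy(500)
ols = (x @ y) / (x @ x)
print(ols, median_ratio_slope(y, x))            # the median-based estimate is far more stable
```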
  5. By: Tommaso Proietti (University of Rome "Tor Vergata"); Alessandra Luati (University of Bologna)
    Abstract: In this chapter we consider a class of parametric spectrum estimators based on a generalized linear model for exponential random variables with power link. The power transformation of the spectrum of a stationary process can be expanded in a Fourier series, with the coefficients representing generalised autocovariances. Direct Whittle estimation of the coefficients is generally unfeasible, as they are subject to constraints (the autocovariances need to be a positive semidefinite sequence). The problem can be overcome by using an ARMA representation for the power transformation of the spectrum. Estimation is carried out by maximising the Whittle likelihood, whereas the selection of a spectral model, as a function of the power transformation parameter and the ARMA orders, can be carried out by information criteria. The proposed methods are applied to the estimation of the inverse autocorrelation function and the related problem of selecting the optimal interpolator, and for the identification of spectral peaks. More generally, they can be applied to spectral estimation with possibly misspecified models.
    Date: 2013–10–03
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:290&r=ecm
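    A minimal Whittle-likelihood sketch related to entry 5, fitting an AR(1) spectral density to the periodogram at the Fourier frequencies. The chapter works with ARMA representations of a power-transformed spectrum, but the estimation principle, maximizing the Whittle likelihood, is the same; the simulated series and starting values are arbitrary.
```python
import numpy as np
from scipy.optimize import minimize

def periodogram(x):
    """Periodogram ordinates at the positive Fourier frequencies."""
    n = len(x)
    freqs = 2 * np.pi * np.arange(1, n // 2 + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:n // 2 + 1]) ** 2 / (2 * np.pi * n)
    return freqs, I

def ar1_spectrum(freqs, phi, sigma2):
    return sigma2 / (2 * np.pi * (1.0 - 2.0 * phi * np.cos(freqs) + phi ** 2))

def neg_whittle(params, freqs, I):
    phi, log_sigma2 = params
    f = ar1_spectrum(freqs, phi, np.exp(log_sigma2))
    return np.sum(np.log(f) + I / f)

rng = np.random.default_rng(2)
e = rng.standard_normal(1000)
x = np.zeros(1000)
for t in range(1, 1000):
    x[t] = 0.6 * x[t - 1] + e[t]                # simulate an AR(1) with phi = 0.6
freqs, I = periodogram(x)
res = minimize(neg_whittle, x0=[0.0, 0.0], args=(freqs, I),
               bounds=[(-0.99, 0.99), (None, None)])
print(res.x)                                    # phi near 0.6, log(sigma^2) near 0
```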
  6. By: Angelo Mele (Johns Hopkins University - Carey Business School)
    Abstract: This paper proposes approximate variational inference methods for estimating a strategic model of social interactions. Players interact in an exogenous network and sequentially choose a binary action. The utility of an action is a function of the choices of neighbors in the network. I prove that the interaction process can be represented as a potential game and that it converges to a unique stationary equilibrium distribution. However, exact inference for this model is infeasible because the likelihood is computationally intractable and cannot be evaluated even when there are few players. To overcome this problem, I propose variational approximations of the likelihood that allow approximate inference. The technique can be applied to any discrete exponential family and is therefore a general tool for inference in models with a large number of players. The methodology is illustrated with several simulated datasets and compared with MCMC methods.
    Keywords: Variational approximations, Bayesian Estimation, Social Interactions
    JEL: D85 C13 C73
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:net:wpaper:1316&r=ecm
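    A generic mean-field sketch in the spirit of entry 6: for a binary-action model on a fixed network, each player's approximate choice probability solves a fixed point in which neighbors' actions are replaced by their means. The payoff parametrization (alpha, beta) and the ring network are invented for illustration; the paper's variational approximations are more elaborate than this.
```python
import numpy as np

def mean_field_probs(A, alpha, beta, iters=200):
    """Fixed-point iteration q_i = sigmoid(alpha + beta * sum_j A_ij q_j)."""
    q = np.full(A.shape[0], 0.5)
    for _ in range(iters):
        q = 1.0 / (1.0 + np.exp(-(alpha + beta * A @ q)))
    return q

# Example: a ring network with positive spillovers.
n = 10
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1.0
print(mean_field_probs(A, alpha=-0.5, beta=0.8))
```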
  7. By: Stephan B. Bruns (Max Planck Institute of Economics, Jena)
    Abstract: Meta-regression models are increasingly used to integrate empirical results across studies while controlling for the potential threats of data mining and publication bias. We propose extended meta-regression models and evaluate their performance in identifying genuine empirical effects by means of a comprehensive simulation study covering scenarios that are prevalent in empirical economics. We show that the proposed meta-regression models systematically outperform the prior gold standard of meta-regression analysis of regression coefficients. Most meta-regression models are robust to the presence of publication bias, but data-mining bias leads to seriously inflated type I errors and has to be addressed explicitly.
    Keywords: Meta-regression, meta-analysis, publication bias, data mining, Monte Carlo simulation
    JEL: C12 C15 C40
    Date: 2013–09–27
    URL: http://d.repec.org/n?u=RePEc:jrp:jrpwrp:2013-040&r=ecm
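    The conventional benchmark that entry 7 extends is the FAT-PET meta-regression: reported estimates are regressed on their standard errors, weighted by precision, and the intercept proxies the genuine effect beyond publication bias. The sketch below shows only this standard starting point with simulated inputs, not the authors' extended models.
```python
import numpy as np
import statsmodels.api as sm

def fat_pet(estimates, std_errors):
    """WLS of estimates on standard errors; the intercept proxies the genuine effect."""
    X = sm.add_constant(std_errors)
    return sm.WLS(estimates, X, weights=1.0 / std_errors ** 2).fit()

# Hypothetical inputs: reported coefficients and their standard errors.
rng = np.random.default_rng(3)
se = rng.uniform(0.05, 0.5, 40)
est = 0.3 + 0.8 * se + rng.normal(0, se)        # simulated publication-bias pattern
print(fat_pet(est, se).summary())               # intercept estimates the genuine effect (0.3)
```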
  8. By: Sukjin Han (Department of Economics, University of Texas at Austin); Edward J. Vytlacil (Department of Economics, New York University)
    Abstract: This paper provides identification results for a class of models specified by a triangular system of two equations with binary endogenous variables. The joint distribution of the latent error terms is specified through a parametric copula structure, including the normal copula as a special case, while the marginal distributions of the latent error terms are allowed to be arbitrary but known. This class of models includes bivariate probit models as a special case. The paper demonstrates that an exclusion restriction is necessary and sufficient for global identification of the model parameters, with the excluded variable allowed to be binary. Based on this result, identification is achieved in a full model in which the common exogenous regressors present in both equations and the excluded instruments may be more general than discretely distributed.
    Keywords: Identification, triangular threshold crossing model, bivariate probit model, endogenous variables, binary response, copula, exclusion restriction
    JEL: C35 C36
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:tex:wpaper:130908&r=ecm
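    For the Gaussian-copula special case of entry 8, i.e. the familiar recursive bivariate probit D = 1{z*g + v > 0}, Y = 1{x*b + delta*D + u > 0} with (u, v) standard bivariate normal with correlation rho, the log-likelihood is built from four orthant probabilities, as sketched below with scalar regressors. The paper's results cover more general copulas with arbitrary known marginals, which this sketch does not attempt.
```python
import numpy as np
from scipy.stats import multivariate_normal

def biprobit_loglik(params, y, d, x, z):
    """Log-likelihood of the recursive bivariate probit with scalar x and z."""
    b, delta, g, rho = params
    mu_y, mu_d = b * x + delta * d, g * z
    ll = 0.0
    for my, md, yi, di in zip(mu_y, mu_d, y, d):
        sy, sd = (1.0 if yi else -1.0), (1.0 if di else -1.0)
        cov = [[1.0, sy * sd * rho], [sy * sd * rho, 1.0]]
        # P(Y = yi, D = di) as a bivariate normal orthant probability
        p = multivariate_normal.cdf([sy * my, sd * md], mean=[0.0, 0.0], cov=cov)
        ll += np.log(max(p, 1e-300))
    return ll
```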
  9. By: Zinn, Jesse
    Abstract: The weighted updating model is a generalization of Bayesian updating that allows for biased beliefs by weighting the functions that constitute Bayes' rule with real exponents. I provide an axiomatic basis for this framework and show that weighting a distribution affects the information entropy of the resulting distribution. This result supports interpreting weighted updating as modelling biases in which individuals mistake the information content of data. I augment the base model in two ways, allowing it to account for additional biases: the first augmentation allows for discrimination between data, and the second allows the weights to vary over time. I also find a set of sufficient conditions for the uniqueness of the maximum likelihood estimates, with log-concavity playing a key role. An application shows that self-attribution bias can lead to optimism bias.
    Keywords: Bayesian Updating, Cognitive Biases, Learning, Uncertainty
    JEL: C02 D03
    Date: 2013–09–30
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:50310&r=ecm
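    A compact numerical illustration of the weighted updating rule of entry 9 in a Beta-Bernoulli example: the prior and the likelihood are each raised to a real exponent before renormalizing, so exponents of one recover standard Bayesian updating and a likelihood weight below one produces under-reaction to the data. The grid and weights are arbitrary.
```python
import numpy as np

def weighted_update(prior, likelihood, w_prior=1.0, w_lik=1.0):
    """Posterior proportional to prior**w_prior * likelihood**w_lik on a grid."""
    post = prior ** w_prior * likelihood ** w_lik
    return post / post.sum()

theta = np.linspace(0.001, 0.999, 999)
prior = np.ones_like(theta)                     # uniform prior on the grid
k, n = 7, 10                                    # 7 successes in 10 trials
lik = theta ** k * (1 - theta) ** (n - k)
bayes = weighted_update(prior, lik)                     # standard Bayesian updating
under = weighted_update(prior, lik, w_lik=0.3)          # under-reaction to the data
print((theta * bayes).sum(), (theta * under).sum())     # posterior means
```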
  10. By: Ismael MOURIFIÉ
    Abstract: This paper considers the evaluation of the average treatment effect (ATE) in a triangular system with binary dependent variables. I impose a threshold crossing model on both the endogenous regressor and the outcome. No parametric functional form or distributional assumptions are imposed. Shaikh and Vytlacil (2011, SV) proposed bounds on the ATE that are sharp only under a restrictive condition on the support of the covariates and the instruments, which rules out a wide range of models and many relevant applications. In some cases, when SV's support condition fails, their bounds have the same empirical content as the model with an unrestricted endogenous regressor. In this setting, I provide a methodology for constructing sharp bounds on the ATE that efficiently uses variation in the covariates without imposing support restrictions.
    Keywords: partial identification, threshold crossing model, triangular system, average treatment effect, endogeneity, social program evaluation.
    JEL: C14 C31 C35
    Date: 2013–10–01
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-498&r=ecm
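    For orientation only, entry 10 can be contrasted with the classical worst-case (no-assumption) bounds on the ATE for a binary outcome and treatment, computed below; the paper's threshold-crossing structure and instruments tighten such bounds, and the sharp bounds it derives are not reproduced here.
```python
import numpy as np

def worst_case_ate_bounds(y, d):
    """No-assumption bounds on E[Y(1)] - E[Y(0)] with binary y and d."""
    y, d = np.asarray(y), np.asarray(d)
    p1, p0 = (d == 1).mean(), (d == 0).mean()
    ey1_lo = ((y == 1) & (d == 1)).mean()       # missing Y(1) values set to 0
    ey0_lo = ((y == 1) & (d == 0)).mean()       # missing Y(0) values set to 0
    return ey1_lo - (ey0_lo + p1), (ey1_lo + p0) - ey0_lo
```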
  11. By: Heckman, James J. (University of Chicago); Pinto, Rodrigo (University of Chicago)
    Abstract: Haavelmo's seminal 1943 paper is the first rigorous treatment of causality. In it, he distinguished the definition of causal parameters from their identification. He showed that causal parameters are defined using hypothetical models that assign variation to some of the inputs determining outcomes while holding all other inputs fixed. He thus formalized and made operational Marshall's (1890) ceteris paribus analysis. We embed Haavelmo's framework into the recursive framework of Directed Acyclic Graphs (DAG) used in one influential recent approach to causality (Pearl, 2000) and in the related literature on Bayesian nets (Lauritzen, 1996). We compare an approach based on Haavelmo's methodology with a standard approach in the causal literature of DAGs – the "do-calculus" of Pearl (2009). We discuss the limitations of DAGs and in particular of the do-calculus of Pearl in securing identification of economic models. We extend our framework to consider models for simultaneous causality, a central contribution of Haavelmo (1944). In general cases, DAGs cannot be used to analyze models for simultaneous causality, but Haavelmo's approach naturally generalizes to cover it.
    Keywords: causality, identification, do-calculus, directed acyclic graphs, simultaneous treatment effects
    JEL: C10 C18
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp7628&r=ecm
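    A toy simulation of the "fixing" operation discussed in entry 11: in a confounded linear model, conditioning on the observed cause and intervening on it give different answers. All coefficients and variable names below are illustrative, not taken from the paper.
```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
u = rng.standard_normal(n)                      # common cause (confounder)
x = u + rng.standard_normal(n)                  # observed cause, driven partly by u
y = 1.0 * x + 2.0 * u + rng.standard_normal(n)

# Conditioning: regressing y on x picks up the confounding path through u.
print((x @ y) / (x @ x))                        # roughly 2.0, not the causal coefficient 1.0

# Intervention: set x externally while holding the rest of the model fixed.
y_do1 = 1.0 * 1.0 + 2.0 * u + rng.standard_normal(n)   # do(X = 1)
y_do0 = 1.0 * 0.0 + 2.0 * u + rng.standard_normal(n)   # do(X = 0)
print(y_do1.mean() - y_do0.mean())              # roughly 1.0, the causal effect
```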
  12. By: Line Elvstrøm Ekner (Department of Economics, Copenhagen University); Emil Nejstgaard (Department of Economics, Copenhagen University)
    Abstract: We propose a new and simple parametrization of the so-called speed-of-transition parameter of the logistic smooth transition autoregressive (LSTAR) model. The new parametrization highlights that a consequence of the well-known identification problem of the speed-of-transition parameter is that the threshold autoregression (TAR) is a limiting case of the LSTAR process. We demonstrate how this fact impedes numerical optimization under the original parametrization, whereas this is not the case for the new parametrization. Next, we show that information criteria provide a tool to choose between an LSTAR model and a TAR model, a choice previously based solely on economic theory. Reestimation of two published applications illustrates the usefulness of our findings.
    Date: 2013–09–19
    URL: http://d.repec.org/n?u=RePEc:kud:kuiedp:1307&r=ecm
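    The logistic transition function at the heart of entry 12, illustrating why the TAR model is the limiting case: as the speed-of-transition parameter grows, the smooth transition collapses to a step function at the threshold.
```python
import numpy as np

def logistic_transition(s, gamma, c):
    """G(s; gamma, c) = 1 / (1 + exp(-gamma * (s - c)))."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

s = np.linspace(-2, 2, 9)
for gamma in (1.0, 10.0, 1000.0):
    print(gamma, np.round(logistic_transition(s, gamma, c=0.0), 3))
# For large gamma the values collapse to 0/1 around the threshold c (the TAR limit).
```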
  13. By: Yolanda F. Rebollo-Sanz (Department of Economics, Universidad Pablo de Olavide); Ainara González de San Román (Instituto de Empresa)
    Abstract: The main contribution of this paper is to provide researchers with a new estimation method suitable for censored models with two high-dimensional fixed effects. The method is based on a sequence of least squares regressions. In practice, its use can result in significant savings in computing time, and it is applicable to datasets where the number of fixed effects makes standard estimation techniques unfeasible. The paper analyses the theoretical properties of the procedure and evaluates its practical performance by means of a Monte Carlo simulation study. Finally, it describes an application to the Spanish economy using a longitudinal matched employer-employee dataset that provides wage information on the working population over a 13-year period. In particular, the paper contributes to the empirical literature on wage determination by providing the first decomposition of individual wages for Spain that takes into account both worker and firm effects after adjusting for censoring. This empirical exercise shows that the biases encountered when censoring is not taken into account can be large enough to overestimate the role of firm effects in wage dispersion. In our empirical research, individual heterogeneity explains more than 60% of wage dispersion.
    Keywords: fixed effects, algorithm, wage decomposition, censoring, simulation, assortative matching
    JEL: I21 I24 J16 J31
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:pab:wpaper:13.05&r=ecm
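    A sketch of the uncensored building block behind entry 13: estimating a slope in the presence of two high-dimensional fixed effects (worker and firm) by alternating group demeaning followed by least squares on the residuals. The worker and firm identifiers are assumed to be integer-coded; the censoring correction, which is the paper's contribution, is not shown.
```python
import numpy as np

def demean_by(v, codes):
    """Subtract group means of v, where codes are integer group labels 0..K-1."""
    means = np.bincount(codes, weights=v) / np.bincount(codes)
    return v - means[codes]

def two_way_fe_slope(y, x, worker, firm, iters=100):
    """Zig-zag demeaning of y and x by worker and firm, then OLS on the residuals."""
    ry, rx = y.astype(float).copy(), x.astype(float).copy()
    for _ in range(iters):
        for codes in (worker, firm):
            ry, rx = demean_by(ry, codes), demean_by(rx, codes)
    return (rx @ ry) / (rx @ rx)
```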
  14. By: Vincent BOUCHER; Ismael MOURIFIÉ
    Abstract: We explore the asymptotic properties of pairwise stable networks (Jackson and Wolinsky, 1996). Specifically, we want to recover a set of parameters of the individuals' utility functions from the observation of a single pairwise stable network. We develop a pseudo maximum likelihood estimator and show that it is consistent and asymptotically normally distributed under a very weak version of homophily. The approach is compelling as it provides explicit, easy-to-check conditions on the admissible set of preferences. Moreover, the method is easily implementable using pre-programmed estimators available in most statistical packages. We provide an application of our method using the Add Health database.
    Keywords: social network, pairwise stability, spatial econometrics
    JEL: C13 D85
    Date: 2013–10–01
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-499&r=ecm
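    A deliberately simplified point of reference for entry 14: a dyadic logit in which the probability of a link decreases with the distance between two individuals' characteristics (homophily). The paper's pseudo maximum likelihood estimator is built on pairwise stability, which this independent-links sketch ignores entirely.
```python
import numpy as np
import statsmodels.api as sm

def dyadic_homophily_logit(G, x):
    """Logit of link indicators on |x_i - x_j| over all unordered pairs (i < j)."""
    i, j = np.triu_indices(len(x), k=1)
    links = G[i, j]                             # 1 if i and j are linked, else 0
    dist = np.abs(x[i] - x[j])                  # homophily regressor
    return sm.Logit(links, sm.add_constant(dist)).fit(disp=0)
```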
  15. By: Stein-Erik, Fleten; Paraschiv, Florentina; Schürle, Michel
    Abstract: We propose a novel regime-switching approach for the simulation of electricity spot prices that is inspired by the class of fundamental models and takes into account the relation between spot and forward prices. Additionally, the model is able to reproduce spikes and negative prices. Market prices are derived given an observed forward curve. We distinguish between a base regime and an upper as well as a lower spike regime. The model parameters are calibrated using historical hourly price forward curves for EEX Phelix and the dynamics of hourly spot prices. We further evaluate various time series models, such as ARMA and GARCH, that are commonly applied to modelling electricity prices, and find that the proposed regime-switching model performs better.
    Keywords: electricity prices, regime-switching model, negative prices, spikes, price forward curves
    Date: 2013–07
    URL: http://d.repec.org/n?u=RePEc:usg:sfwpfi:2013:11&r=ecm
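    A stylized three-regime simulator in the spirit of entry 15: hourly deviations from a given price forward curve follow a Markov chain over a mean-reverting base regime, an upper spike regime, and a lower spike regime that can push prices negative. All parameter values, including the transition matrix, are invented for illustration and are not the paper's calibrated values.
```python
import numpy as np

def simulate_spot(forward_curve, P, seed=0):
    """Regime-switching deviations added to an hourly price forward curve."""
    rng = np.random.default_rng(seed)
    n = len(forward_curve)
    regimes = np.zeros(n, dtype=int)
    dev = np.zeros(n)
    for t in range(1, n):
        regimes[t] = rng.choice(3, p=P[regimes[t - 1]])
        if regimes[t] == 0:                     # base regime: mean-reverting noise
            dev[t] = 0.8 * dev[t - 1] + rng.normal(0, 2.0)
        elif regimes[t] == 1:                   # upper spike
            dev[t] = rng.exponential(40.0)
        else:                                   # lower spike (can push prices negative)
            dev[t] = -rng.exponential(60.0)
    return forward_curve + dev, regimes

# Spikes are rare and short-lived in this illustrative transition matrix.
P = np.array([[0.96, 0.02, 0.02],
              [0.80, 0.20, 0.00],
              [0.80, 0.00, 0.20]])
prices, regimes = simulate_spot(np.full(24 * 7, 50.0), P)
print(prices.min(), prices.max())               # negative prices and spikes can occur
```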
  16. By: Michael Bleaney; Zhiyong Li
    Abstract: The performance of bid-ask spread estimators is investigated using simulation experiments. All estimators are much more accurate if the data are sampled at high frequency. In high-frequency data, the Huang-Stoll estimator, which requires order flow information, generally outperforms Roll-type estimators based on price information only. The exception is when there is feedback trading (order flows respond to past price movements), when the Huang-Stoll estimator is seriously biased. When only low-frequency (e.g. daily) data are available, the Corwin-Schultz estimator based on daily high and low prices is usually less inaccurate than the Huang-Stoll and Roll estimators. An important and empirically relevant exception is when the spread varies within the day; in this case the Corwin-Schultz estimator significantly overestimates the true spread.
    Date: 2013–09
    URL: http://d.repec.org/n?u=RePEc:not:notecp:13/05&r=ecm
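    The simplest of the price-only estimators compared in entry 16 is the Roll estimator, which infers the spread from the negative first-order autocovariance of price changes; a quick simulated bid-ask bounce around a random-walk midpoint shows it recovering the true spread. The Huang-Stoll and Corwin-Schultz estimators studied in the paper are not reproduced here.
```python
import numpy as np

def roll_spread(prices):
    """2 * sqrt(-cov(dp_t, dp_{t-1})); returns nan if the autocovariance is positive."""
    dp = np.diff(np.asarray(prices, dtype=float))
    autocov = np.cov(dp[1:], dp[:-1])[0, 1]
    return 2.0 * np.sqrt(-autocov) if autocov < 0 else np.nan

# Quick check with a simulated bid-ask bounce around a random-walk midpoint.
rng = np.random.default_rng(5)
mid = np.cumsum(rng.normal(0, 0.01, 100_000))
q = rng.choice([-1.0, 1.0], size=100_000)       # buy/sell indicator
obs = mid + 0.05 * q                            # half-spread of 0.05, true spread 0.10
print(roll_spread(obs))                         # close to 0.10
```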

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.