nep-ecm New Economics Papers
on Econometrics
Issue of 2017‒10‒15
eleven papers chosen by
Sune Karlsson
Örebro universitet

  1. Bayesian Parametric and Semiparametric Factor Models for Large Realized Covariance Matrices By Jin, Xin; Maheu, John M; Yang, Qiao
  2. Estimation for time-invariant effects in dynamic panel data models with application to income dynamics By Yonghui Zhang; Qiankun Zhou
  3. Inference on Auctions with Weak Assumptions on Information By Vasilis Syrgkanis; Elie Tamer; Juba Ziani
  4. Term Structure Analysis with Big Data By Andreasen, Martin M.; Christensen, Jens H. E.; Rudebusch, Glenn D.
  5. Large-Scale Portfolio Allocation Under Transaction Costs and Model Uncertainty: Adaptive Mixing of High- and Low-Frequency Information By Hautsch, Nikolaus; Voigt, Stefan
  6. An Integrated Model for Discontinuous Preference Change and Satiation By Nobuhiko Terui; Shohei Hasegawa; Adam N. Smith; Greg M. Allenby
  7. Linear Time Iteration By Rendahl, Pontus
  8. Invariance Axioms and Functional Form Restrictions in Structural Models By Dagsvik, John K
  9. Replicating "Predicting the present with Google trends" by Hyunyoung Choi and Hal Varian (The Economic Record, 2012) By Coupé, Tom
  10. Which tests not witch hunts: A diagnostic approach for conducting replication research By Wood, Benjamin Douglas Kuflick
  11. Replication to assess statistical adequacy By Owen, Dorian

  1. By: Jin, Xin; Maheu, John M; Yang, Qiao
    Abstract: This paper introduces a new factor structure suitable for modeling large realized covariance matrices with full likelihood-based estimation. Parametric and nonparametric versions are introduced. Due to the computational advantages of our approach, we can model the factor nonparametrically as a Dirichlet process mixture or as an infinite hidden Markov mixture, which leads to an infinite mixture of inverse-Wishart distributions. Applications to 10 assets and 60 assets show that the models perform well. By exploiting parallel computing, the models can be estimated in a matter of minutes.
    Keywords: infinite hidden Markov model, Dirichlet process mixture, inverse-Wishart, predictive density, high-frequency data
    JEL: C11 C14 C32 C58 G17
    Date: 2017–10–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:81920&r=ecm
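As a rough, hedged illustration of the kind of prior structure this abstract describes, the sketch below draws covariance matrices from a truncated stick-breaking (Dirichlet process) mixture of inverse-Wishart components. The truncation level, component parameters, and all names are my own illustrative assumptions, not the authors' model or code.

```python
# Illustrative only: a truncated Dirichlet process mixture of inverse-Wishart
# distributions over covariance matrices (stick-breaking construction).
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(0)
n_assets, n_comp, alpha = 10, 20, 1.0        # dimension, truncation level, DP concentration

# Truncated stick-breaking weights for the Dirichlet process
v = rng.beta(1.0, alpha, size=n_comp)
w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
w /= w.sum()                                 # renormalise after truncation

# Each mixture component is an inverse-Wishart over covariance matrices
dfs = n_assets + 2 + rng.integers(0, 10, size=n_comp)
scales = [np.eye(n_assets) * rng.uniform(0.5, 2.0) for _ in range(n_comp)]

def draw_cov():
    """Draw one covariance matrix from the truncated mixture."""
    k = rng.choice(n_comp, p=w)
    return invwishart.rvs(df=dfs[k], scale=scales[k])

sample = [draw_cov() for _ in range(5)]
print(sample[0].shape)                       # (10, 10), symmetric positive definite
```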
  2. By: Yonghui Zhang; Qiankun Zhou
    Abstract: A two-step estimation procedure is proposed to estimate the time-invariant effects, i.e., the slopes of the time-invariant regressors, in dynamic panel data models. In the first step, generalized method of moments (GMM) is used to estimate the time-varying effects, and in the second step the time-series averages of the residuals from the GMM estimation are regressed cross-sectionally by OLS on the time-invariant regressors to estimate the time-invariant effects. It is shown that the OLS estimator of the time-invariant effects is √N-consistent and asymptotically normally distributed. A consistent estimator of its asymptotic variance is also provided, which is robust to heteroscedastic errors and works well even if the errors are serially correlated. Monte Carlo simulations confirm the theoretical findings. An application to income dynamics highlights the importance of estimating time-invariant effects, such as those of education, race and gender, in the return to schooling.
    Keywords: Dynamic panel, GMM, OLS, Time-invariant effects, Return to schooling.
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:lsu:lsuwpp:2017-12&r=ecm
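The second step described above is simple enough to sketch: average the first-step residuals over time for each unit, then regress those averages on the time-invariant regressors by cross-sectional OLS. The sketch assumes the first-step GMM residuals are already in hand, and the HC0-style sandwich variance is my illustrative choice, not necessarily the variance estimator derived in the paper.

```python
# Minimal sketch of the second-step cross-sectional OLS (assumptions noted above).
import numpy as np

def second_step_ols(resid, Z):
    """
    resid : (N, T) residuals from a first-step GMM estimation of the
            time-varying part of the dynamic panel model.
    Z     : (N, K) time-invariant regressors (include a constant column).
    Returns OLS coefficients and heteroscedasticity-robust standard errors.
    """
    ybar = resid.mean(axis=1)                        # time-series average per unit
    ZtZ_inv = np.linalg.inv(Z.T @ Z)
    gamma = ZtZ_inv @ Z.T @ ybar                     # cross-sectional OLS
    u = ybar - Z @ gamma
    meat = (Z * u[:, None]).T @ (Z * u[:, None])
    V = ZtZ_inv @ meat @ ZtZ_inv                     # sandwich variance estimate
    return gamma, np.sqrt(np.diag(V))

# Toy check with simulated "residuals" (purely illustrative)
rng = np.random.default_rng(1)
N, T = 500, 8
Z = np.column_stack([np.ones(N), rng.normal(size=N)])    # constant + one time-invariant regressor
true_gamma = np.array([0.5, 1.0])
resid = (Z @ true_gamma)[:, None] + rng.normal(scale=2.0, size=(N, T))
gamma_hat, se = second_step_ols(resid, Z)
print(gamma_hat, se)                                 # estimates near (0.5, 1.0)
```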
  3. By: Vasilis Syrgkanis; Elie Tamer; Juba Ziani
    Abstract: Given a sample of bids from independent auctions, this paper examines the question of inference on auction fundamentals (e.g. valuation distributions, welfare measures) under weak assumptions on the information structure. The question is important as it allows us to learn about the valuation distribution in a robust way, i.e., without assuming that a particular information structure holds across observations. We leverage recent contributions in the robust mechanism design literature that exploit the link between Bayesian Correlated Equilibria and Bayesian Nash Equilibria in incomplete information games to construct an econometric framework for learning about auction fundamentals using observed data on bids. We showcase our construction of identified sets in private value and common value auctions. Our approach for constructing these sets inherits the computational simplicity of solving for correlated equilibria: checking whether a particular valuation distribution belongs to the identified set is as simple as determining whether a linear program is feasible. A similar linear program can be used to construct the identified set on various welfare measures and counterfactual objects. For inference and to summarize statistical uncertainty, we propose novel finite sample methods using tail inequalities that are used to construct confidence regions on sets. We also highlight methods based on the Bayesian bootstrap and subsampling. A set of Monte Carlo experiments shows adequate finite sample properties of our inference procedures. We also illustrate our methods using data from OCS auctions.
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1710.03830&r=ecm
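The abstract's key computational claim is that set membership reduces to a linear-programming feasibility problem. Below is a generic hedged sketch of such a check using scipy.optimize.linprog; the constraint matrices are placeholders standing in for the paper's equilibrium obedience and data-consistency constraints, not the actual construction.

```python
# Generic feasibility check standing in for the paper's identified-set test.
import numpy as np
from scipy.optimize import linprog

def in_identified_set(A_eq, b_eq, A_ub=None, b_ub=None):
    """Return True if there exists x >= 0 with A_eq x = b_eq
    (and optionally A_ub x <= b_ub)."""
    n = A_eq.shape[1]
    res = linprog(c=np.zeros(n),                 # feasibility only: any objective works
                  A_eq=A_eq, b_eq=b_eq,
                  A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * n,
                  method="highs")
    return res.status == 0                       # 0 = solved, i.e. the program is feasible

# Toy example: does any probability vector over 3 states match two linear moments?
A_eq = np.array([[1.0, 1.0, 1.0],                # probabilities sum to one
                 [0.0, 1.0, 2.0]])               # a moment condition
b_eq = np.array([1.0, 0.8])
print(in_identified_set(A_eq, b_eq))             # True: e.g. x = (0.6, 0.0, 0.4)
```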
  4. By: Andreasen, Martin M. (Aarhus University); Christensen, Jens H. E. (Federal Reserve Bank of San Francisco); Rudebusch, Glenn D. (Federal Reserve Bank of San Francisco)
    Abstract: Analysis of the term structure of interest rates almost always takes a two-step approach. First, actual bond prices are summarized by interpolated synthetic zero-coupon yields, and second, a small set of these yields is used as the source data for further empirical examination. In contrast, we consider the advantages of a one-step approach that directly analyzes the universe of bond prices. To illustrate the feasibility and desirability of the one-step approach, we compare arbitrage-free dynamic term structure models estimated using both approaches. We also provide a simulation study showing that a one-step approach can extract the information in large panels of bond prices and avoid any arbitrary noise introduced by a first-stage interpolation of yields.
    JEL: C58 G12 G17
    Date: 2017–09–15
    URL: http://d.repec.org/n?u=RePEc:fip:fedfwp:2017-21&r=ecm
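To make the one-step idea concrete, here is a minimal sketch in which a parametric discount curve is fitted directly to coupon bond prices by nonlinear least squares, with no first-stage interpolation of zero-coupon yields. The Nelson-Siegel functional form and the bond universe below are my illustrative choices, not the paper's arbitrage-free dynamic term structure models.

```python
# Illustrative "one-step" curve fit to coupon bond prices (not the paper's model).
import numpy as np
from scipy.optimize import least_squares

def ns_yield(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel zero-coupon yield at maturity tau (in years)."""
    x = lam * tau
    h = (1.0 - np.exp(-x)) / x
    return beta0 + beta1 * h + beta2 * (h - np.exp(-x))

def bond_price(times, flows, params):
    """Price a bond as the sum of cash flows discounted at model yields."""
    y = ns_yield(times, *params)
    return np.sum(flows * np.exp(-y * times))

# Illustrative annual-coupon bonds: (maturity in years, coupon per 100 face)
specs = [(1, 1.5), (2, 2.0), (3, 2.5), (5, 3.0), (7, 3.5), (10, 4.0)]
bonds = []
for maturity, coupon in specs:
    times = np.arange(1, maturity + 1, dtype=float)
    flows = np.full(maturity, coupon)
    flows[-1] += 100.0
    bonds.append((times, flows))

true_params = (0.04, -0.02, 0.01, 0.5)
prices = np.array([bond_price(t, c, true_params) for t, c in bonds])

def residuals(params):
    """Pricing errors across the whole bond universe: the one-step objective."""
    return np.array([bond_price(t, c, params) - p
                     for (t, c), p in zip(bonds, prices)])

fit = least_squares(residuals, x0=np.array([0.03, 0.0, 0.0, 0.7]))
print(fit.x)    # with noiseless prices this should land close to true_params
```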
  5. By: Hautsch, Nikolaus; Voigt, Stefan
    Abstract: We propose a Bayesian sequential learning framework for high-dimensional asset allocations under model ambiguity and parameter uncertainty. The model is estimated via MCMC methods and allows for a wide range of data sources as inputs. Employing the proposed framework on a large set of NASDAQ-listed stocks, we observe that time-varying mixtures of high- and low-frequency based return predictions significantly improve the out-of-sample portfolio performance.
    JEL: C52 C11 C58 G11
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:zbw:vfsc17:168222&r=ecm
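The time-varying mixing of high- and low-frequency information can be illustrated in a stripped-down way: update the weight on each forecasting model sequentially by its one-step-ahead predictive likelihood. The paper's actual framework is a full Bayesian model with transaction costs estimated by MCMC; the sketch below, with made-up Gaussian forecasts, only conveys the mixing mechanism.

```python
# Sequential updating of mixture weights on two stand-in return forecasts.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
T = 250
returns = rng.normal(0.0, 1.0, size=T)           # simulated placeholder returns

# Stand-in forecasts: (mean, std) pairs from two hypothetical models
hf_forecasts = [(0.0, 0.9) for _ in range(T)]    # "high-frequency" model
lf_forecasts = [(0.0, 1.5) for _ in range(T)]    # "low-frequency" model

w = np.array([0.5, 0.5])                         # prior model weights
weights_path = []
for t in range(T):
    like = np.array([norm.pdf(returns[t], *hf_forecasts[t]),
                     norm.pdf(returns[t], *lf_forecasts[t])])
    w = w * like
    w = w / w.sum()                              # posterior weights after observing r_t
    weights_path.append(w.copy())

print(weights_path[-1])                          # weight drifts toward the better-calibrated model
```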
  6. By: Nobuhiko Terui; Shohei Hasegawa; Adam N. Smith; Greg M. Allenby
    Abstract: We develop a structural model of horizontal and temporal variety seeking using a dynamic factor model that relates attribute satiation to brand preferences. The factor model employs a threshold specification that triggers preference changes when customer satiation exceeds an admissible level but leaves preferences unchanged otherwise. The factor model is developed for the high-dimensional switching data encountered when multiple brands are purchased across multiple time periods. The model is applied to two scanner-panel datasets, where we find distinct shifts in consumer preferences over time and evidence that consumers value variety much more than traditional models indicate. Insights into brand preference are provided by a dynamic joint space map that displays brand positions and changes in consumer preferences over time.
    Date: 2017–09–28
    URL: http://d.repec.org/n?u=RePEc:toh:dssraa:70&r=ecm
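A loose, hypothetical sketch of the threshold mechanism the abstract describes: a latent satiation state accumulates with repeated purchases of a focal brand and, once it exceeds an admissible level, triggers a discrete preference shift; otherwise preferences stay put. The functional forms and parameter values below are illustrative only and are not taken from the paper.

```python
# Illustrative threshold-triggered preference change in a simple purchase simulation.
import numpy as np

rng = np.random.default_rng(5)
T, threshold, decay = 50, 3.0, 0.8
pref = 1.0                                   # baseline utility of the focal brand
satiation = 0.0
history = []
for t in range(T):
    buy_focal = rng.random() < 1.0 / (1.0 + np.exp(-pref))   # logit purchase probability
    satiation = decay * satiation + (1.0 if buy_focal else 0.0)
    if satiation > threshold:                # discontinuous preference change
        pref -= 1.0                          # variety seeking: focal brand becomes less attractive
        satiation = 0.0
    history.append((t, buy_focal, round(satiation, 2), pref))

print(history[-5:])
```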
  7. By: Rendahl, Pontus (University of Cambridge, Faculty of Economics)
    Abstract: This paper proposes a simple iterative method – time iteration – to solve linear rational expectation models. I prove that this method converges to the solution with the smallest eigenvalues in absolute value, and provide the conditions under which this solution is unique. In particular, if conditions similar to those of Blanchard and Kahn (1980) are met, the procedure converges to the unique stable solution. Apart from its transparency and simplicity of implementation, the method provides a straightforward approach to solving models with less standard features, such as regime switching models. For large-scale problems the method is 10-20 times faster than existing solution methods.
    Keywords: Linear systems, rational expectation models, fixed point iteration
    JEL: C02 C61 C62 C63
    Date: 2017–09
    URL: http://d.repec.org/n?u=RePEc:ihs:ihsesp:330&r=ecm
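The method in this abstract can be sketched compactly. Under my reading of the standard setup in this literature, the model takes the form A x_{t-1} + B x_t + C E_t[x_{t+1}] = 0 with candidate solution x_t = F x_{t-1}; guessing x_{t+1} = F x_t and solving the time-t equation gives the update F <- -(B + C F)^{-1} A, iterated to a fixed point. Consult the paper for the exact recursion and the convergence and uniqueness conditions.

```python
# Sketch of linear time iteration for A x_{t-1} + B x_t + C E_t[x_{t+1}] = 0.
import numpy as np

def linear_time_iteration(A, B, C, tol=1e-12, max_iter=10_000):
    n = A.shape[0]
    F = np.zeros((n, n))                     # initial guess for the policy matrix
    for _ in range(max_iter):
        F_new = -np.linalg.solve(B + C @ F, A)
        if np.max(np.abs(F_new - F)) < tol:
            return F_new
        F = F_new
    raise RuntimeError("time iteration did not converge")

# Toy scalar example: 0.3 x_{t-1} - x_t + 0.5 E_t[x_{t+1}] = 0
A, B, C = np.array([[0.3]]), np.array([[-1.0]]), np.array([[0.5]])
F = linear_time_iteration(A, B, C)
print(F)   # ~0.368, the root of 0.5 F^2 - F + 0.3 = 0 that is smallest in absolute value
```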
  8. By: Dagsvik, John K (SSB)
    Abstract: The dominant practice in economics is to choose the mathematical specification of model relations on the basis of convenience, without much theoretical support. This paper discusses how quantitative model specifications can, in some cases, be given a more formal scientific underpinning in the sense of being based on a priori theory. I use an example from discrete choice theory to illustrate that it is sometimes possible to obtain a complete characterization of the choice model derived from a set of plausible axioms. Furthermore, I discuss how axioms can be tested non-parametrically, given that suitable Stated Preference data are available.
    Keywords: Functional form; Theory of measurement; Invariance principles; Independence from Irrelevant Alternatives; Testing of inequality hypotheses
    JEL: C40 C51 D12
    Date: 2017–08–30
    URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2017_008&r=ecm
  9. By: Coupé, Tom
    Abstract: In this note, the author describes different ways one could try to replicate Choi and Varian (Predicting the present with Google trends, The Economic Record, 2012).
    Keywords: Replication
    JEL: A1 C1 C8 C53
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201776&r=ecm
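For readers who want to see what such a replication involves computationally, here is a minimal sketch of the kind of nowcasting comparison Choi and Varian run: a seasonal autoregression for the target series, estimated with and without a Google Trends index, compared on out-of-sample error. The data below are simulated placeholders, and the exact lag structure is my assumption; an actual replication would substitute the published target series and the corresponding Trends query index.

```python
# Illustrative nowcast comparison on simulated data: AR with lags 1 and 12,
# with and without a stand-in Google Trends regressor.
import numpy as np

rng = np.random.default_rng(3)
T = 120
trends = rng.normal(size=T)                       # stand-in Google Trends index
y = np.zeros(T)
for t in range(12, T):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 12] + 0.4 * trends[t] + rng.normal(scale=0.5)

def forecast_errors(use_trends, train_end=100):
    """Fit on t < train_end, forecast t >= train_end, return absolute errors."""
    X = np.column_stack([np.ones(T - 12), y[11:T - 1], y[0:T - 12]])   # const, lag 1, lag 12
    if use_trends:
        X = np.column_stack([X, trends[12:T]])
    split = train_end - 12
    beta, *_ = np.linalg.lstsq(X[:split], y[12:train_end], rcond=None)
    return np.abs(X[split:] @ beta - y[train_end:])

for use_trends in (False, True):
    label = "with Trends" if use_trends else "baseline   "
    print(label, forecast_errors(use_trends).mean())   # mean absolute out-of-sample error
```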
  10. By: Wood, Benjamin Douglas Kuflick
    Abstract: This paper provides researchers with an objective list of checks to consider when planning a replication study aimed at validating findings that inform policy. Such replication studies should begin with a pure replication of the published results and then reanalyse the original data to address the original research question. The author presents tips for replication exercises in four categories: validity of assumptions, data transformations, estimation methods, and heterogeneous impacts. For each category he offers an introduction, a checklist of tips, examples of how these checks have been employed, and a set of resources that provide statistical and econometric details.
    Keywords: Replication,diagnostic,validation,impact evaluation,reanalysis,risk of bias
    JEL: C10 B41 A20
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201777&r=ecm
  11. By: Owen, Dorian
    Abstract: "Statistical adequacy" is an important prerequisite for securing reliable inference in empirical modelling. This paper argues for more emphasis on replication that specifically assesses whether the results reported in empirical studies are based on statistically adequate models, i.e., models with valid underpinning statistical assumptions that satisfy relevant diagnostic tests of misspecification. A replication plan is briefly outlined to illustrate what this would involve in practice in the context of a specific study by Acemoglu, Gallego and Robinson (Institutions, human capital, and development, Annual Review of Economics, 2014).
    Keywords: replication,statistical adequacy,inference,instrumental variables,reduced form,fundamental determinants of economic development
    JEL: C31 C36 I25 P14 O10
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201773&r=ecm
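As a concrete illustration of the kind of misspecification battery a "statistical adequacy" replication could run, the sketch below fits an OLS regression and applies standard statsmodels diagnostics for heteroscedasticity, residual autocorrelation, and non-normality. The particular tests chosen here are my illustrative selection, not the author's prescribed list.

```python
# Illustrative misspecification diagnostics for an OLS regression.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan, acorr_breusch_godfrey
from statsmodels.stats.stattools import jarque_bera

rng = np.random.default_rng(4)
n = 200
X = sm.add_constant(rng.normal(size=(n, 2)))     # placeholder regressors
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)

res = sm.OLS(y, X).fit()

bp_stat, bp_pval, _, _ = het_breuschpagan(res.resid, res.model.exog)   # heteroscedasticity
bg_stat, bg_pval, _, _ = acorr_breusch_godfrey(res, nlags=2)           # serial correlation
jb_stat, jb_pval, _, _ = jarque_bera(res.resid)                        # normality of errors

print(f"Breusch-Pagan p={bp_pval:.3f}  Breusch-Godfrey p={bg_pval:.3f}  Jarque-Bera p={jb_pval:.3f}")
# Small p-values would flag heteroscedasticity, serial correlation, or non-normal
# errors, i.e. evidence against the statistical adequacy of the fitted model.
```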

This nep-ecm issue is ©2017 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.