nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒06‒30
thirteen papers chosen by
Sune Karlsson
Orebro University

  1. Estimating nonlinear DSGE models with moments based methods By Sergei Ivashchenko
  2. A survey of semiparametric efficiency bounds for some microeconometric models By Thomas A Severini; Gautam Tripathi
  3. Vector Autoregression with Mixed Frequency Data By Qian, Hang
  4. Dynamic mixture-of-experts models for longitudinal and discrete-time survival data By Quiroz, Matias; Villani, Mattias
  5. Distortions of multivariate distribution functions and associated level curves: applications in multivariate risk theory By Elena Di Bernardino; Didier Rullière
  6. On the distortions of Archimedean copulas: Application to the non-parametric estimation of their generators By Elena Di Bernardino; Didier Rullière
  7. A simple but efficient approach to the analysis of multilevel data By Bache, Stefan Holst Milton; Kristensen, Troels
  8. Disentangling the Effects of Multiple Treatments - Measuring the Net Economic Impact of the 1995 Great Hanshin-Awaji Earthquake By Hiroshi Fujiki; Cheng Hsiao
  9. Invariant Inference and Efficient Computation in the Static Factor Model By Joshua C.C. Chan; Roberto Leon-Gonzalez; Rodney W. Strachan
  10. Exploring or reducing noise? A global optimization algorithm in the presence of noise By Didier Rullière; Alaeddine Faleh; Frédéric Planchet; Wassim Youssef
  11. Un-truncating VARs By De Graeve, Ferre; Westermark, Andreas
  12. Panel data discrete choice models of consumer demand By Michael P. Keane
  13. Compound Wishart Matrices and Noisy Covariance Matrices: Risk Underestimation By Benoît Collins; David McDonald; Nadia Saad

  1. By: Sergei Ivashchenko
    Abstract: This article suggests a new approach to approximating the moments of nonlinear DSGE models. The approximations are fast and accurate enough to be used for estimating the parameters of nonlinear DSGE models. A small financial DSGE model is repeatedly estimated by several approaches. The approximated moments are close to moments calculated from large-sample simulations. The quality of estimation with the suggested approach is close to that of estimation based on the Central Difference Kalman Filter (CDKF), while the suggested approach is much faster.
    Keywords: DSGE, DSGE-VAR, GMM, nonlinear estimation
    JEL: C13 C32 E32
    Date: 2013–06–11
  2. By: Thomas A Severini (Northwestern University, Evanston, USA); Gautam Tripathi (CREA, University of Luxembourg)
    Abstract: In this survey, we evaluate estimators by comparing their asymptotic variances. The role of the efficiency bound, in this context, is to give a lower bound to the asymptotic variance of an estimator. An estimator with asymptotic variance equal to the efficiency bound can therefore be said to be asymptotically efficient. These bounds are also useful for understanding how the features of a given model affect the accuracy of parameter estimation.
    Keywords: Efficiency bounds, Semiparametric models
    JEL: C14
    Date: 2013
  3. By: Qian, Hang
    Abstract: Three new approaches are proposed to handle mixed frequency Vector Autoregression. The first is an explicit solution for the likelihood and posterior distribution. The second is a parsimonious, time-invariant and invertible state space form. The third is a parallel Gibbs sampler without forward filtering and backward sampling. The three methods are unified in that all of them exploit the fact that the mixed frequency observations impose linear constraints on the distribution of the high frequency latent variables. In a simulation study the approaches are compared, and the parallel Gibbs sampler outperforms the others. A financial application to yield curve forecasting is conducted using mixed frequency macro-finance data.
    Keywords: VAR, Temporal aggregation, State space, Parallel Gibbs sampler
    JEL: C11 C32 C82
    Date: 2013–06
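The key fact exploited by all three of Qian's approaches — that mixed-frequency observation is a linear constraint on high-frequency latents — can be illustrated with a toy Gaussian example. This is an illustrative sketch only, not the paper's algorithm: the prior, covariance, and averaging scheme below are all assumptions.

```python
import numpy as np

# Toy illustration (not the paper's method): a quarterly observation y_q
# equal to the average of three monthly latent values imposes the linear
# constraint A x = y_q on the monthly vector x, with A = [1/3, 1/3, 1/3].
# Conditioning an (assumed) Gaussian prior for x on this constraint is
# standard multivariate-normal conditioning.

mu = np.array([1.0, 2.0, 3.0])       # assumed prior mean of monthly latents
Sigma = np.eye(3)                    # assumed prior covariance
A = np.full((1, 3), 1.0 / 3.0)       # temporal aggregation matrix
y_q = np.array([2.5])                # observed quarterly average

# Conditional distribution of x given A x = y_q (Gaussian conditioning)
S = A @ Sigma @ A.T
K = Sigma @ A.T @ np.linalg.inv(S)   # "gain" matrix
mu_post = mu + K @ (y_q - A @ mu)
Sigma_post = Sigma - K @ A @ Sigma

# The posterior mean satisfies the constraint exactly:
print(A @ mu_post)                   # -> [2.5]
```

The posterior places all monthly latents on the hyperplane defined by the observed aggregate, which is exactly the structure a Gibbs sampler for such a model draws from.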
  4. By: Quiroz, Matias (Research Department, Central Bank of Sweden); Villani, Mattias (Linköping University)
    Abstract: We propose a general class of flexible models for longitudinal data with special emphasis on discrete-time survival data. The model is a finite mixture model where the subjects are allowed to move between components through time. The time-varying probability of component memberships is modeled as a function of subject-specific time-varying covariates. This allows for interesting within-subject dynamics and manageable computations even with a large number of subjects. Each parameter in the component densities and in the mixing function is connected to its own set of covariates through a link function. The models are estimated using a Bayesian approach via a highly efficient Markov Chain Monte Carlo (MCMC) algorithm with tailored proposals and variable selection in all sets of covariates. The focus of the paper is on models for discrete-time survival data with an application to bankruptcy prediction for Swedish firms, using both exponential and Weibull mixture components. The dynamic mixture-of-experts models are shown to have an interesting interpretation and to dramatically improve the out-of-sample predictive density forecasts compared to models with time-invariant mixture probabilities.
    Keywords: Bayesian inference; Markov Chain Monte Carlo; Bayesian variable selection; Survival Analysis; Mixture-of-experts
    JEL: C11 C41 D21 G33
    Date: 2013–05–01
  5. By: Elena Di Bernardino (CEDRIC - Centre d'Etude et De Recherche en Informatique du Cnam - Conservatoire National des Arts et Métiers (CNAM)); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429)
    Abstract: In this paper, we propose a parametric model for multivariate distributions. The model is based on distortion functions, i.e. transformations of a multivariate distribution which make it possible to generate new families of multivariate distribution functions. We derive some properties of the considered distortions. A suitable proximity indicator between level curves is introduced in order to evaluate the quality of candidate distortion parameters. Using this proximity indicator and properties of distorted level curves, we give a specific estimation procedure. The estimation algorithm relies mainly on straightforward univariate optimizations, and we finally obtain parametric representations of both multivariate distribution functions and associated level curves. Our results are motivated by applications in multivariate risk theory. The methodology is illustrated on simulated and real examples.
    Keywords: Multivariate probability distortions, Level sets estimation, Iterated compositions, Hyperbolic conversion functions, Multivariate risk measures
    Date: 2013–05–04
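The core idea of a distortion can be sketched in one dimension: composing a distribution function with an increasing bijection of [0, 1] yields a new distribution function. The exponential CDF and the power distortion below are toy assumptions, not the paper's hyperbolic conversion functions.

```python
import math

# Toy sketch of a distortion (assumed example, not the paper's family):
# any increasing bijection T : [0,1] -> [0,1] with T(0)=0 and T(1)=1
# turns a CDF F into a new CDF G = T o F.

def F(x):
    """Standard exponential CDF (assumed base distribution)."""
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

def T(u):
    """Toy distortion function on [0, 1]."""
    return u ** 2

def G(x):
    """Distorted distribution function G = T o F."""
    return T(F(x))

# G inherits the CDF properties: G(0) = 0, G is increasing, G -> 1.
print(G(0.0), G(1.0), G(10.0))
```

In the multivariate setting of the paper, each marginal transformation and the dependence structure are distorted jointly, which is what deforms the level curves that the proposed proximity indicator compares.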
  6. By: Elena Di Bernardino (CEDRIC - Centre d'Etude et De Recherche en Informatique du Cnam - Conservatoire National des Arts et Métiers (CNAM)); Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429)
    Abstract: We study the impact of some distortions for Archimedean copulas. We give some admissibility conditions for these distortions, and define some equivalence classes for both distortions and generators of Archimedean copulas. We investigate some impacts of the distortions on the tails of the distorted copula. We extend the r-fold composition of the diagonal section of a copula, from r in N to r in R. This extension, coupled with results on equivalence classes, gives us new expressions of distortions and generators. Estimators derived directly from these expressions are proposed and their convergence is investigated. We provide confidence bands for the estimated generators. Numerical illustrations show the empirical performance of these estimators.
    Keywords: Distortions; Archimedean copula; auto-nested copula; non-parametric estimation; tail dependence
    Date: 2013–06–13
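The r-fold composition of the diagonal section, and its extension to real r, can be made concrete for the independence copula, where everything has a closed form. This copula choice and the closed form are illustrative assumptions; the paper treats general Archimedean copulas.

```python
# Toy sketch (assumed example): for the independence copula
# C(u, v) = u*v, the diagonal section is delta(u) = u**2, so the
# r-fold composition is delta^{(r)}(u) = u**(2**r). The extension
# from integer r to real r lets the exponent 2**r vary continuously;
# this closed form is specific to this toy copula.

def delta(u):
    """Diagonal section of the independence copula."""
    return u * u

def delta_r(u, r):
    """r-fold composition of delta, for integer r, by iteration."""
    for _ in range(r):
        u = delta(u)
    return u

def delta_r_real(u, r):
    """Continuous extension via the closed form u**(2**r)."""
    return u ** (2 ** r)

# The two constructions agree at integer r:
print(delta_r(0.5, 3), delta_r_real(0.5, 3))
```

For a general Archimedean generator there is no such closed form, which is why the paper's results on equivalence classes are needed to define and estimate the real-indexed composition.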
  7. By: Bache, Stefan Holst Milton (COHERE, Department of Business and Economics); Kristensen, Troels (COHERE, Department of Business and Economics)
    Abstract: Much research in health economics revolves around the analysis of hierarchically structured data. For instance, combining characteristics of patients with information pertaining to the general practice (GP) clinic providing treatment is called for in order to investigate important features of the underlying nested structure. In this paper we offer a new treatment of the two-level random-intercept model and state equivalence results for specific estimators, including popular two-step estimators. We show that a certain encompassing regression equation, based on a Mundlak-type specification, provides a surprisingly simple approach to efficient estimation and a straightforward way to assess the assumptions required. As an illustration, we combine unique information on the morbidity of Danish type 2 diabetes patients with information about GP clinics to investigate the association with fee-for-service healthcare expenditure. Our approach allows us to conclude that explanatory power is mainly provided by patient information and patient mix, whereas (possibly unobserved) clinic characteristics seem to play a minor role.
    Keywords: Multilevel models; random intercepts; nested models; Mundlak device; correlated random effects; 2-step estimation; estimated dependent variables; fee-for-service expenditures; type 2 diabetes
    JEL: C01 C18 C38 H51 I18
    Date: 2013–06–20
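The Mundlak-type encompassing regression mentioned in the abstract amounts to adding the group means of the covariates as extra regressors. The simulation design below (one covariate, OLS via least squares) is a minimal sketch under assumed parameter values, not the authors' specification.

```python
import numpy as np

# Minimal sketch (assumed design) of a Mundlak-type encompassing
# regression for a two-level random-intercept model: regress y on the
# patient-level covariate x AND its clinic-level mean x_bar. The
# coefficient on x_bar captures correlation between the clinic effect
# and the covariate; testing it against zero assesses the usual
# random-effects assumption.

rng = np.random.default_rng(0)
n_clinics, n_patients = 50, 20
clinic = np.repeat(np.arange(n_clinics), n_patients)
x = rng.normal(size=n_clinics * n_patients)
u = rng.normal(size=n_clinics)[clinic]        # clinic random intercepts
y = 1.0 + 2.0 * x + u + rng.normal(size=x.size)

# Clinic-level means of x (the Mundlak terms)
x_bar = np.array([x[clinic == c].mean() for c in range(n_clinics)])[clinic]

# OLS on [1, x, x_bar]
X = np.column_stack([np.ones_like(x), x, x_bar])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta)  # slope beta[1] should be close to the true value 2
```

With the group means included, the slope on x coincides with the within estimator, which is the equivalence-type result that makes this single regression a convenient substitute for two-step multilevel estimation.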
  8. By: Hiroshi Fujiki (Associate Director-General and Senior Economist, Institute for Monetary and Economic Studies, Bank of Japan); Cheng Hsiao (Professor, Department of Economics, University of Southern California)
    Abstract: We propose a panel data approach to disentangle the impact of “one treatment” from the “other treatment” when the observed outcomes are subject to both treatments. We use the Great Hanshin-Awaji earthquake that took place on January 17, 1995 to illustrate our methodology. We find that there were no persistent earthquake effects. The observed persistent effects are due to structural change in Hyogo prefecture.
    Keywords: Multiple Treatment Effects, Panel Data, Great Hanshin-Awaji Earthquake
    JEL: C18 C23 C52
    Date: 2013–06
  9. By: Joshua C.C. Chan; Roberto Leon-Gonzalez; Rodney W. Strachan
    Abstract: Factor models are used in a wide range of areas. Two issues with Bayesian versions of these models are a lack of invariance to ordering of the variables and computational inefficiency. This paper develops invariant and efficient Bayesian methods for estimating static factor models. This approach leads to inference on the number of factors that does not depend upon the ordering of the variables, and we provide arguments to explain this invariance. Beginning from identified parameters which have nonstandard forms, we use parameter expansions to obtain a specification with standard conditional posteriors. We show significant gains in computational efficiency. Identifying restrictions that are commonly employed result in interpretable factors or loadings and, using our approach, these can be imposed ex-post. This allows us to investigate several alternative identifying schemes without the need to respecify and resample the model. We apply our methods to a simple example using a macroeconomic dataset.
    Date: 2013–05
  10. By: Didier Rullière (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Alaeddine Faleh (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Frédéric Planchet (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Wassim Youssef (Winter & associés - Winter & associés)
    Abstract: We consider the problem of the global minimization of a function observed with noise. This problem occurs for example when the objective function is estimated through stochastic simulations. We propose an original method for iteratively partitioning the search domain when this domain is a finite union of simplexes. On each subdomain of the partition, we compute an indicator measuring whether or not the subdomain is likely to contain a global minimizer. The next areas to be explored are chosen in accordance with this indicator. Confidence sets for minimizers are given. Numerical applications show empirical convergence results, and illustrate the compromise to be made between the global exploration of the search domain and the focus around potential minimizers of the problem.
    Keywords: Global Optimisation; Simplex; Branch-and-Bound; Kriging
    Date: 2013
  11. By: De Graeve, Ferre (Research Department, Central Bank of Sweden); Westermark, Andreas (Research Department, Central Bank of Sweden)
    Abstract: Macroeconomic research often relies on structural vector autoregressions to uncover empirical regularities. Critics argue the method goes awry due to lag truncation: short lag-lengths imply a poor approximation to DSGE-models. Empirically, short lag-length is deemed necessary as increased parametrization induces excessive uncertainty. The paper shows that this argument is incomplete. Longer lag-length simultaneously reduces misspecification, which in turn reduces variance. For data generated by frontier DSGE-models long-lag VARs are feasible, reduce bias and variance, and have better coverage. Thus, contrary to conventional wisdom, the trivial solution to the critique actually works.
    Keywords: VAR; SVAR; Lag-length; Truncation
    JEL: C18 E37
    Date: 2013–06–01
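Why lag truncation biases short VARs can be seen in a one-variable example: a process with a moving-average component has an infinite-order autoregressive representation, and a short-lag fit simply drops the tail of those coefficients. The ARMA(1,1) parameters below are assumed for illustration; they are not taken from the paper.

```python
# Toy sketch of lag truncation (assumed parameters): an ARMA(1,1)
# process y_t = a*y_{t-1} + e_t + b*e_{t-1} has the infinite-order AR
# representation y_t = sum_j pi_j * y_{t-j} + e_t with
# pi_1 = a + b and pi_j = -b * pi_{j-1} for j > 1.
# A short-lag VAR discards the tail of these coefficients.

a, b = 0.9, 0.5
pi = [a + b]                 # pi_1
for _ in range(11):
    pi.append(-b * pi[-1])   # pi_2, ..., pi_12

# The discarded coefficients are non-negligible at lag 4 but tiny by
# lag 12, which is why longer lags reduce the truncation misspecification.
print(abs(pi[3]), abs(pi[11]))
```

DSGE models generally imply exactly this kind of VARMA structure for the observables, so the geometric decay above is the mechanism behind the paper's claim that long-lag VARs reduce both bias and, through reduced misspecification, variance.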
  12. By: Michael P. Keane (Nuffield College and Department of Economics, University of Oxford)
    Date: 2013–06–03
  13. By: Benoît Collins; David McDonald; Nadia Saad
    Abstract: In this paper, we obtain a property of the expectation of the inverse of compound Wishart matrices which results from their orthogonal invariance. Using this property as well as results from random matrix theory (RMT), we derive the asymptotic effect of the noise induced by estimating the covariance matrix on computing the risk of the optimal portfolio. This in turn enables us to get an asymptotically unbiased estimator of the risk of the optimal portfolio not only for the case of independent observations but also in the case of correlated observations. This improvement provides a new approach to estimate the risk of a portfolio based on covariance matrices estimated from exponentially weighted moving averages of stock returns.
    Date: 2013–06
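The underestimation phenomenon the paper corrects can be demonstrated numerically: the in-sample risk of a minimum-variance portfolio built from an estimated covariance matrix sits systematically below its true risk. The identity true covariance, i.i.d. Gaussian returns, and the dimensions below are toy assumptions; the asymptotic ratio is on the order of (1 - p/n)**2 in this i.i.d. setting.

```python
import numpy as np

# Toy demonstration (assumed setup) of covariance-noise risk
# underestimation: build the minimum-variance portfolio from a sample
# covariance matrix and compare its in-sample risk with its risk under
# the true covariance (identity here).

rng = np.random.default_rng(1)
p, n = 50, 200                     # assets, observations (p/n = 0.25)
R = rng.normal(size=(n, p))        # i.i.d. returns, true covariance = I
S = (R.T @ R) / n                  # sample covariance matrix

ones = np.ones(p)
w = np.linalg.solve(S, ones)
w /= ones @ w                      # estimated minimum-variance weights

in_sample = w @ S @ w              # risk computed with the noisy estimate
true_risk = w @ np.eye(p) @ w      # risk under the true covariance

# The ratio is materially below 1 (around (1 - p/n)**2 asymptotically
# for i.i.d. data), i.e. the estimated risk is over-optimistic.
print(in_sample / true_risk)
```

The paper's contribution is an asymptotically unbiased correction of this gap that remains valid for correlated observations, such as exponentially weighted moving averages of returns, where the simple i.i.d. factor above no longer applies.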

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.