nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒09‒05
fourteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Nonparametric Inference on Quantile Marginal Effects By David Kaplan
  2. By Kusdhianto Setiawan; Koichi Maekawa
  3. Bootstrapping Kernel-Based Semiparametric Estimators By Matias D. Cattaneo; Michael Jansson
  4. Spectrum-based estimators of the bivariate Hurst exponent By Ladislav Kristoufek
  5. Putting the Patient in Patient Reported Outcomes: A Robust Methodology for Health Outcomes Assessment By Ian M. McCarthy
  6. Treatment Effects on Combined Outcomes: An Application to Health-related Quality-of-life Data By Ian M. McCarthy
  7. A Noisy Principal Component Analysis for Forward Rate Curves By Marcio Laurini; Alberto Ohashi
  8. Why High-order Polynomials Should not be Used in Regression Discontinuity Designs By Andrew Gelman; Guido Imbens
  9. Model Uncertainty in Panel Vector Autoregressive Models By Koop, Gary; Korobilis, Dimitris
  10. Density Forecasting using Bayesian Global Vector Autoregressions with Common Stochastic Volatility By Florian Huber
  11. Assessing Bayesian Model Comparison in Small Samples By Martinez-Garcia, Enrique; Wynne, Mark A.
  12. Econometric Methods for Modelling Systems with a Mixture of I(1) and I(0) Variables By Hyeon-Seung Huh; Lance Fisher; Adrian Pagan
  13. Disentangled Jump-Robust Realized Covariances and Correlations with Non-Synchronous Prices By Harry-Paul Vander Elst; David Veredas
  14. Estimating time-varying DSGE models using minimum distance methods By Giraitis, Liudas; Kapetanios, George; Theodoridis, Konstantinos; Yates, Tony

  1. By: David Kaplan (Department of Economics, University of Missouri-Columbia)
    Abstract: We propose a nonparametric method to construct confidence intervals for quantile marginal effects (i.e., derivatives of the conditional quantile function). Under certain conditions, a quantile marginal effect equals a causal (structural) effect in a general nonseparable model, or equals an average thereof within a particular subpopulation. The high-order accuracy of our method is derived. Simulations and an empirical example demonstrate the new method's favorable performance and practical use. Code for the new method is provided.
    Keywords: fractional order statistics, high-order accuracy, nonseparable models
    JEL: C21
    Date: 2014–08–19
  2. By: Kusdhianto Setiawan; Koichi Maekawa
    Abstract: The standard vector error correction (VEC) model assumes iid normal disturbances. This paper extends that assumption to allow a GARCH process in the errors; we call the resulting model the VEC-GARCH model. Because the number of parameters in a VEC-GARCH model is large, maximum likelihood (ML) estimation is computationally demanding. To overcome these computational difficulties, the first part of the paper searches for alternative estimation methods and compares them by Monte Carlo simulation on a relatively small VEC-GARCH model: an unrestricted VECM system with three variables and one lag. After rewriting the VEC-GARCH model as a seemingly unrelated regression (SUR) model, we apply a feasible generalized least squares (FGLS) estimator. In our simulations the FGLS estimator performs as well as the ML estimator, and both outperform OLS and the standard VECM that ignores the error structure. In the second part, we apply the VEC-GARCH model to international asset pricing data and test the conditional CAPM using the FGLS-GARCH estimation strategy. Since the model is relatively large, involving 12 stock market indexes, computational problems arise in estimating expected returns under the VEC-GARCH model and in testing the conditional CAPM by ML. Accounting for heteroskedasticity and cross-correlation in the error terms of international stock market returns, the international Capital Asset Pricing Model (CAPM) is reinvestigated under SUR with GARCH (SUR-GARCH) errors, with the FGLS estimator modified to accommodate the multivariate GARCH error structure. A world market portfolio was constructed to ensure that the market portfolio is mean-variance efficient with no restrictions on short selling or borrowing at the riskless rate. The CAPM fits well only in the ex-post SUR test; it is rejected under SUR-GARCH in both the ex-ante and ex-post tests. However, the CAPM could be applied to most stock market indexes when each equation was analyzed individually.
    Keywords: United States, United Kingdom, Germany, Singapore, Hong Kong, Argentina, Brazil, China, Indonesia, Malaysia, Mexico, Forecasting and projection methods, Finance
    Date: 2014–07–03
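The SUR/FGLS step the abstract describes can be sketched in a few lines. This is a plain homoskedastic-FGLS sketch of the idea only: the authors' estimator additionally models a multivariate GARCH error structure, which is not reproduced here, and the function names are my own.

```python
import numpy as np

def sur_fgls(y_list, X_list):
    """One-step feasible GLS for a SUR system (homoskedastic sketch;
    the paper's FGLS-GARCH additionally models GARCH errors)."""
    n = y_list[0].shape[0]
    m = len(y_list)
    # Stage 1: equation-by-equation OLS residuals.
    resid = np.column_stack([
        y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        for y, X in zip(y_list, X_list)
    ])
    sigma = resid.T @ resid / n          # estimated cross-equation covariance
    # Stage 2: GLS on the stacked system with Omega = Sigma (x) I_n.
    k_total = sum(X.shape[1] for X in X_list)
    X_big = np.zeros((m * n, k_total))
    col = 0
    for i, X in enumerate(X_list):
        k = X.shape[1]
        X_big[i * n:(i + 1) * n, col:col + k] = X
        col += k
    y_big = np.concatenate(y_list)
    omega_inv = np.kron(np.linalg.inv(sigma), np.eye(n))
    XtOi = X_big.T @ omega_inv
    return np.linalg.solve(XtOi @ X_big, XtOi @ y_big)
```

With correlated errors across equations and distinct regressors, this estimator exploits the cross-equation covariance that equation-by-equation OLS ignores.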
  3. By: Matias D. Cattaneo (University of Michigan); Michael Jansson (UC Berkeley and CREATES)
    Abstract: This paper develops alternative asymptotic results for a large class of two-step semiparametric estimators. The first main result is an asymptotic distribution result for such estimators and differs from those obtained in earlier work on classes of semiparametric two-step estimators by accommodating a non-negligible bias. A noteworthy feature of the assumptions under which the result is obtained is that reliance on a commonly employed stochastic equicontinuity condition is avoided. The second main result shows that the bootstrap provides an automatic method of correcting for the bias even when it is non-negligible.
    Keywords: Semiparametric estimation, bootstrapping, asymptotic separability.
    JEL: C14 C15
    Date: 2014–07–30
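The bootstrap the abstract refers to can be illustrated with a toy two-step estimator: a kernel density estimate (step 1) plugged into a sample average (step 2), resampled nonparametrically. This is a generic illustration under my own choices of kernel and bandwidth, not the paper's estimator class or its bias-correction theory.

```python
import numpy as np

def theta_hat(sample, h):
    """Toy two-step estimator of theta = E[f(X)]: a Gaussian kernel
    density estimate (step 1) averaged over the sample (step 2)."""
    d = sample[:, None] - sample[None, :]
    return np.mean(np.exp(-0.5 * (d / h) ** 2)) / (h * np.sqrt(2 * np.pi))

def bootstrap_ci(sample, h, B=200, alpha=0.05, seed=0):
    """Nonparametric bootstrap percentile interval for theta_hat:
    resample the data with replacement and recompute the full two-step
    estimator each time."""
    rng = np.random.default_rng(seed)
    stats = np.array([
        theta_hat(rng.choice(sample, size=sample.size, replace=True), h)
        for _ in range(B)
    ])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])
```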
  4. By: Ladislav Kristoufek
    Abstract: We introduce two new estimators of the bivariate Hurst exponent in the power-law cross-correlations setting -- the cross-periodogram and $X$-Whittle estimators. Because spectrum-based estimators depend on the part of the spectrum used in estimation, we also provide a simulation study showing the estimators' performance under a varying bandwidth parameter, varying correlation between the processes, and varying process specifications. The newly introduced estimators are less biased than the existing averaged periodogram estimator, which, however, has slightly lower variance. The spectrum-based estimators can serve as a good complement to the popular time-domain estimators.
    Date: 2014–08
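A spectrum-based estimator of this kind can be sketched as a GPH-style log-periodogram regression on the absolute cross-periodogram, using |I_xy(l_j)| ~ c * l_j^(1 - 2*H_xy) at low frequencies. This is my generic sketch of the idea; the exact estimator, normalization, and bandwidth rule in the paper may differ.

```python
import numpy as np

def cross_periodogram_hurst(x, y, m=None):
    """Estimate the bivariate Hurst exponent H_xy by regressing the log
    absolute cross-periodogram on log frequency over the first m
    Fourier frequencies (bandwidth choice here is ad hoc)."""
    n = len(x)
    if m is None:
        m = int(n ** 0.5)
    fx = np.fft.rfft(x - x.mean())
    fy = np.fft.rfft(y - y.mean())
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / n                      # Fourier frequencies
    ixy = np.abs(fx[j] * np.conj(fy[j])) / (2 * np.pi * n)
    slope = np.polyfit(np.log(lam), np.log(ixy), 1)[0]
    return (1 - slope) / 2                       # spectrum ~ lam^(1 - 2H)
```

For two copies of the same white-noise series, the cross-spectrum is flat, so the estimate should sit near H = 0.5.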
  5. By: Ian M. McCarthy
    Abstract: When analyzing many health-related quality-of-life (HRQoL) outcomes, statistical inference is often based on the summary score formed by combining the individual domains of the HRQoL profile into a single measure. Through a series of Monte Carlo simulations, this paper illustrates that reliance solely on the summary score may lead to biased estimates of incremental effects, and I propose a novel two-stage approach that allows for unbiased estimation of incremental effects. The proposed methodology essentially reverses the order of the analysis, from one of "aggregate, then estimate" to one of "estimate, then aggregate." Compared to relying solely on the summary score, the approach also offers a more patient-centered interpretation of results by estimating regression coefficients and incremental effects in each of the HRQoL domains, while still providing estimated effects in terms of the overall summary score. I provide an application to the estimation of incremental effects of demographic and clinical variables on HRQoL following surgical treatment for adult scoliosis and spinal deformity.
    Date: 2014–08
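The "estimate, then aggregate" reversal can be sketched as follows. This is a toy illustration only: OLS stands in for the paper's (unspecified) first-stage estimators, and `aggregate` is a placeholder for the instrument's possibly nonlinear summary-scoring algorithm, not an actual HRQoL tariff.

```python
import numpy as np

def estimate_then_aggregate(domains, X, aggregate, x0, x1):
    """Stage 1: fit one linear regression per HRQoL domain.
    Stage 2: map predicted domain scores through the summary scoring
    function and contrast two covariate profiles x0 and x1."""
    betas = [np.linalg.lstsq(X, d, rcond=None)[0] for d in domains]
    def summary(x):
        return aggregate(np.array([x @ b for b in betas]))
    return summary(x1) - summary(x0)   # incremental effect on the summary score
```

When the aggregator is linear (e.g. a simple mean of domains), the two-stage effect coincides with the direct summary-score regression; the approach matters when the aggregator is nonlinear.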
  6. By: Ian M. McCarthy
    Abstract: A variety of empirical techniques now exist to estimate average treatment effects when treatment participation is subject to selection on observed variables. Applied researchers often adopt these same tools for combined outcomes, in which the ultimate outcome of interest is formed as some combination of two or more underlying outcome variables. The current paper illustrates that an analysis based solely on the combined outcome yields biased treatment-effect estimates when the underlying outcome variables are discrete. This is particularly relevant in the assessment of quality-adjusted life-years (QALYs), which are formed in part by aggregating a multivariate health-related quality-of-life (HRQoL) profile into a single summary score. The analysis adopts an alternative two-step estimator that first estimates the treatment effect on each individual outcome and then reinterprets the treatment effect in terms of the combined outcome based on predicted values from the first-stage regressions. Focusing on HRQoL outcomes, the two-stage estimator is shown to restore the unbiased estimation of treatment effects on the combined outcome under a variety of alternative data generating processes. An application to the study of HRQoL outcomes following complex spine surgery is also provided.
    Date: 2014–08
  7. By: Marcio Laurini; Alberto Ohashi
    Abstract: Principal component analysis (PCA) is the most common nonparametric method for estimating the volatility structure of Gaussian interest rate models. One major difficulty in the estimation of these models is that forward rate curves are not directly observable in the market, so non-trivial observational errors arise in any statistical analysis. In this work, we point out that classical PCA is not suitable for estimating factors of forward rate curves due to the presence of measurement errors induced by market microstructure effects and numerical interpolation. Our analysis indicates that PCA based on the long-run covariance matrix is capable of extracting the true covariance structure of the forward rate curves in the presence of observational errors. Moreover, it provides a significant reduction in the pricing errors due to the noisy data typically found in forward rate curves.
    Date: 2014–08
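The mechanism can be demonstrated in a toy one-factor simulation: i.i.d. measurement error becomes an MA(1) component after differencing, and its autocovariances largely cancel in a Newey-West long-run covariance, so PCA on that matrix recovers the factor loading that PCA on the sample covariance distorts. The Bartlett kernel and bandwidth below are my own choices, not necessarily the paper's.

```python
import numpy as np

def long_run_cov(x, L=None):
    """Newey-West (Bartlett-kernel) long-run covariance of a T x k
    series; bandwidth L is ad hoc if not supplied."""
    n, k = x.shape
    x = x - x.mean(axis=0)
    if L is None:
        L = int(round(n ** (1 / 3)))
    S = x.T @ x / n
    for l in range(1, L + 1):
        g = x[l:].T @ x[:-l] / n
        S += (1 - l / (L + 1)) * (g + g.T)
    return S

def first_pc(cov):
    """Leading eigenvector (estimated factor loading) of a covariance."""
    vals, vecs = np.linalg.eigh(cov)
    return vecs[:, -1]
```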
  8. By: Andrew Gelman; Guido Imbens
    Abstract: It is common in regression discontinuity analysis to control for high-order (third, fourth, or higher) polynomials of the forcing variable. We argue that estimators for causal effects based on such methods can be misleading, and we recommend that researchers not use them, relying instead on estimators based on local linear or quadratic polynomials or other smooth functions.
    JEL: C01 C1
    Date: 2014–08
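The recommended local linear alternative can be sketched as two separate linear fits within a bandwidth on either side of the cutoff, with the jump given by the difference of boundary intercepts. The uniform kernel and fixed bandwidth here are illustrative choices of mine, not the authors' specific recommendations.

```python
import numpy as np

def rd_local_linear(x, y, cutoff=0.0, h=0.5):
    """RD jump estimate via separate local linear fits (uniform kernel)
    within bandwidth h on each side of the cutoff."""
    def side_intercept(mask):
        xs, ys = x[mask], y[mask]
        Z = np.column_stack([np.ones(xs.size), xs - cutoff])
        return np.linalg.lstsq(Z, ys, rcond=None)[0][0]  # value at cutoff
    above = (x >= cutoff) & (x < cutoff + h)
    below = (x < cutoff) & (x > cutoff - h)
    return side_intercept(above) - side_intercept(below)
```

On a smooth conditional mean with a genuine discontinuity, the local linear contrast recovers the jump without the erratic boundary behavior a global high-order polynomial can exhibit.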
  9. By: Koop, Gary; Korobilis, Dimitris
    Abstract: We develop methods for Bayesian model averaging (BMA) or selection (BMS) in Panel Vector Autoregressions (PVARs). Our approach allows us to select between or average over all possible combinations of restricted PVARs where the restrictions involve interdependencies between and heterogeneities across cross-sectional units. The resulting BMA framework can find a parsimonious PVAR specification, thus dealing with overparameterization concerns. We use these methods in an application involving the euro area sovereign debt crisis and show that our methods perform better than alternatives. Our findings contradict a simple view of the sovereign debt crisis which divides the euro zone into groups of core and peripheral countries and worries about financial contagion within the latter group.
    Keywords: Bayesian model averaging, stochastic search variable selection, financial contagion, sovereign debt crisis
    JEL: C11 C33 C52 G10
    Date: 2014
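The model-averaging logic can be illustrated on a single equation by weighting every regressor subset with exp(-BIC/2) as an approximate marginal likelihood. This is a generic BMA sketch of my own, not the paper's stochastic-search algorithm over restricted PVARs.

```python
import numpy as np
from itertools import combinations

def bma_bic(y, X, names):
    """Exhaustive BMA over regressor subsets with BIC-approximated
    marginal likelihoods; returns posterior inclusion probabilities."""
    n, k = X.shape
    models, bics = [], []
    for r in range(k + 1):
        for idx in combinations(range(k), r):
            Z = np.column_stack([np.ones(n)] + [X[:, i] for i in idx])
            b, *_ = np.linalg.lstsq(Z, y, rcond=None)
            rss = np.sum((y - Z @ b) ** 2)
            bics.append(n * np.log(rss / n) + Z.shape[1] * np.log(n))
            models.append(idx)
    w = np.exp(-0.5 * (np.array(bics) - min(bics)))
    w /= w.sum()
    # Posterior inclusion probability: total weight of models containing i.
    return {names[i]: sum(wj for wj, m in zip(w, models) if i in m)
            for i in range(k)}
```

The exhaustive enumeration is feasible only for a handful of candidate restrictions; MCMC-based stochastic search, as in the paper, replaces it when the model space is large.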
  10. By: Florian Huber (Department of Economics, Vienna University of Economics and Business)
    Abstract: This paper puts forward a Bayesian Global Vector Autoregressive model with Common Stochastic Volatility (B-GVAR-CSV). We assume that country-specific volatility is driven by a single latent stochastic process, which simplifies the analysis and yields significant computational gains. Apart from the computational advantages, this assumption is also justified on the grounds that the volatility of most macroeconomic quantities considered in our application tends to follow a similar pattern. Furthermore, Minnesota priors are used to introduce shrinkage and mitigate the curse of dimensionality. The model is then used to produce predictive densities for a set of macroeconomic aggregates. The dataset consists of quarterly data spanning 1995:Q1 to 2012:Q4 and includes 45 economies plus the Euro Area. Our results indicate that the stochastic volatility specification improves forecast accuracy along two dimensions. First, it increases the overall predictive fit of our model; this can be seen for several of the variables under scrutiny, most notably real GDP and short-term interest rates. Second, it makes the model more resilient to outliers and economic crises, so that, evaluated over time, the log predictive scores show significantly less variation than under homoscedastic models.
    Keywords: Density Forecasting, Stochastic Volatility, Global vector autoregressions
    JEL: C32 F44 E32 E47
    Date: 2014–07
  11. By: Martinez-Garcia, Enrique (Federal Reserve Bank of Dallas); Wynne, Mark A. (Federal Reserve Bank of Dallas)
    Abstract: We investigate the Bayesian approach to model comparison within a two-country framework with nominal rigidities using the workhorse New Keynesian open-economy model of Martínez-García and Wynne (2010). We discuss the trade-offs that monetary policy characterized by a Taylor-type rule faces in an interconnected world, with perfectly flexible exchange rates. We then use posterior model probabilities to evaluate the weight of evidence in support of such a model when estimated against more parsimonious specifications that either abstract from monetary frictions or assume autarky by means of controlled experiments that employ simulated data. We argue that Bayesian model comparison with posterior odds is sensitive to sample size and the choice of observable variables for estimation. We show that posterior model probabilities strongly penalize overfitting which can lead us to favor a less parameterized model against the true data-generating process when the two become arbitrarily close to each other. We also illustrate that the spill-overs from monetary policy across countries have an added confounding effect.
    Keywords: Bayesian
    JEL: C11 C13 F41
    Date: 2014–08–01
  12. By: Hyeon-Seung Huh; Lance Fisher; Adrian Pagan
    Abstract: This paper considers structural models when both I(1) and I(0) variables are present. The structural shocks associated with either set of variables could be permanent or transitory. We therefore classify the shocks as (P1, P0) and (T1, T0), where P/T distinguishes permanent from transitory, while 1/0 indicates whether they are attached to I(1) or I(0) variables. We first analyse what happens when there are P0 shocks. This is done using a sequence of examples and shows a variety of outcomes that differ from standard results in the cointegration literature. Conditions are then derived on the nature of the SVAR in the event that T0 (and no P0) shocks are present. Following this, a general method that allows for either P0 or T0 shocks is described and related to the literature that treats I(0) variables as cointegrating with themselves. Finally, we turn to an examination of a well-known empirical SVAR where there are P0 shocks. This SVAR is re-formulated so that the extra shock coming from the introduction of an I(0) variable does not affect relative prices in the long run, i.e. it is T0, and we find that this has major implications for whether there is a price puzzle. It is also shown how to handle long-run parametric restrictions in the presence of P0 shocks when some shocks are identified using sign restrictions.
    Keywords: Macroeconometric modeling
    Date: 2014–07–03
  13. By: Harry-Paul Vander Elst; David Veredas
    Keywords: realized measures; noise; jumps; synchronization
    JEL: C50
    Date: 2014–08
  14. By: Giraitis, Liudas (Queen Mary, University of London); Kapetanios, George (Queen Mary, University of London); Theodoridis, Konstantinos (Bank of England); Yates, Tony (University of Bristol and Centre for Macroeconomics)
    Abstract: This paper uses kernel methods to estimate a seven variable time-varying (TV) vector autoregressive (VAR) model on the US data set constructed by Smets and Wouters. We use an indirect inference method to map from this TV VAR to time variation in implied dynamic stochastic general equilibrium (DSGE) parameters. We find that many parameters change substantially, particularly those defining nominal rigidities, habits and investment adjustment costs. In contrast to the ‘Great Moderation’ literature our monetary policy parameter estimates suggest that authorities tried to deliver a low and stable inflation from 1975 onwards. However, the severe adverse supply shocks in the 70s could have caused these policies to fail.
    Keywords: DSGE; structural change; kernel estimation; time-varying VAR; monetary policy shocks
    JEL: C14 C18 E52 E61 E66
    Date: 2014–08–22
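The kernel idea behind the TV VAR can be sketched in a univariate toy version: at each date, fit the autoregression by least squares with weights that decay in the distance to that date. The Gaussian kernel and bandwidth below are my own illustrative choices; the paper estimates a seven-variable system, not an AR(1).

```python
import numpy as np

def tv_ar1(y, H):
    """Kernel-weighted least squares for a time-varying AR(1)
    coefficient: at each date s, regress y_t on y_{t-1} with Gaussian
    weights exp(-0.5 * ((t - s) / H)^2)."""
    n = len(y) - 1
    t = np.arange(n)
    phi = np.empty(n)
    for s in range(n):
        w = np.exp(-0.5 * ((t - s) / H) ** 2)
        phi[s] = np.sum(w * y[1:] * y[:-1]) / np.sum(w * y[:-1] ** 2)
    return phi
```

On data whose AR coefficient drifts over the sample, the kernel estimates track the drift, which is the property the paper exploits before mapping the TV VAR into time-varying DSGE parameters.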

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found on its homepage. For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.