nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒07‒18
nineteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Sieve Semiparametric Two-Step GMM under Weak Dependence By Xiaohong Chen; Zhipeng Liao
  2. High-Dimensional Copula-Based Distributions with Mixed Frequency Data By Oh, Dong Hwan; Patton, Andrew J.
  3. Inference in Near Singular Regression By Peter C. B. Phillips
  4. Confidence sets for the date of a break in level and trend when the order of integration is unknown By David Harvey; Stephen Leybourne
  5. Inference Based on Many Conditional Moment Inequalities By Donald W.K. Andrews; Xiaoxia Shi
  6. In-Sample Confidence Bands and Out-of-Sample Forecast Bands for Time-Varying Parameters in Observation Driven Models By Francisco Blasques; Siem Jan Koopman; Katarzyna Łasak; André Lucas
  7. Modelling Dependence in High Dimensions with Factor Copulas By Oh, Dong Hwan; Patton, Andrew J.
  8. Shape Regressions By Franco Peracchi; Samantha Leorato
  9. Misspecification Testing in GARCH-MIDAS Models By Conrad, Christian; Schienle, Melanie
  10. Identification of Nonparametric Simultaneous Equations Models with a Residual Index Structure By Steven T. Berry; Philip A. Haile
  11. Quantile Cointegration in the Autoregressive Distributed-Lag Modelling Framework By JIN SEO CHO; TAE-HWAN KIM; YONGCHEOL SHIN
  12. Backtesting Strategies Based on Multiple Signals By Robert Novy-Marx
  13. Estimating the Competitive Storage Model with Trending Commodity Prices By Christophe Gouel; Nicolas Legrand
  14. Hybrid scheme for Brownian semistationary processes By Mikkel Bennedsen; Asger Lunde; Mikko S. Pakkanen
  15. Statistical Inference and Efficient Portfolio Investment Performance By Shibo Liu; Tom Weyman-Jones; Karligash Glass
  16. Persistence, Signal-Noise Pattern and Heterogeneity in Panel Data: With an Application to the Impact of Foreign Direct Investment on GDP By Biørn, Erik; Han, Xuehui
  17. Causal transmission in reduced-form models By Vassili Bazinas; Bent Nielsen
  18. Surprised by the Gambler’s and Hot Hand Fallacies? A Truth in the Law of Small Numbers By Joshua B. Miller; Adam Sanjurjo
  19. Measuring spot variance spillovers when (co)variances are time-varying – the case of multivariate GARCH models By Fengler, Matthias R.; Herwartz, Helmut

  1. By: Xiaohong Chen (Cowles Foundation, Yale University); Zhipeng Liao (Dept. of Economics, UCLA)
    Abstract: This paper considers semiparametric two-step GMM estimation and inference with weakly dependent data, where unknown nuisance functions are estimated via sieve extremum estimation in the first step. We show that although the asymptotic variance of the second-step GMM estimator may not have a closed form expression, it can be well approximated by sieve variances that have simple closed form expressions. We present consistent or robust variance estimation, Wald tests and Hansen's (1982) over-identification tests for the second step GMM that properly reflect the first-step estimated functions and the weak dependence of the data. Our sieve semiparametric two-step GMM inference procedures are shown to be numerically equivalent to the ones computed as if the first step were parametric. A new consistent random-perturbation estimator of the derivative of the expectation of the non-smooth moment function is also provided.
    Keywords: Sieve two-step GMM, Weakly dependent data, Auto-correlation robust inference, Semiparametric over-identification test, Numerical equivalence, Random perturbation derivative estimator
    JEL: C12 C22 C32 C14
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2012&r=ecm
  2. By: Oh, Dong Hwan (Board of Governors of the Federal Reserve System (U.S.)); Patton, Andrew J. (Duke University)
    Abstract: This paper proposes a new model for high-dimensional distributions of asset returns that utilizes mixed frequency data and copulas. The dependence between returns is decomposed into linear and nonlinear components, enabling the use of high frequency data to accurately forecast linear dependence, and a new class of copulas designed to capture nonlinear dependence among the resulting uncorrelated, low frequency, residuals. Estimation of the new class of copulas is conducted using composite likelihood, facilitating applications involving hundreds of variables. In- and out-of-sample tests confirm the superiority of the proposed models applied to daily returns on constituents of the S&P 100 index.
    Keywords: Composite likelihood; forecasting; high frequency data; nonlinear dependence
    JEL: C32 C51 C58
    Date: 2015–05–19
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2015-50&r=ecm
  3. By: Peter C. B. Phillips (Cowles Foundation, Yale University)
    Abstract: This paper considers stationary regression models with near-collinear regressors. Limit theory is developed for regression estimates and test statistics in cases where the signal matrix is nearly singular in finite samples and is asymptotically degenerate. Examples include models that involve evaporating trends in the regressors that arise in conditions such as growth convergence. Structural equation models are also considered and limit theory is derived for the corresponding instrumental variable estimator, Wald test statistic, and overidentification test when the regressors are endogenous.
    Keywords: Endogeneity, Instrumental variable, Overidentification test, Regression, Singular Signal Matrix, Structural equation
    JEL: C23
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2009&r=ecm
  4. By: David Harvey; Stephen Leybourne
    Abstract: We propose methods for constructing confidence sets for the timing of a break in level and/or trend that have asymptotically correct coverage for both I(0) and I(1) processes. These are based on inverting a sequence of tests for the break location, evaluated across all possible break dates. We separately derive locally best invariant tests for the I(0) and I(1) cases; under their respective assumptions, the resulting confidence sets provide correct asymptotic coverage regardless of the magnitude of the break. We suggest use of a pre-test procedure to select between the I(0)- and I(1)- based confidence sets, and Monte Carlo evidence demonstrates that our recommended procedure achieves good finite sample properties in terms of coverage and length across both I(0) and I(1) environments. An application using US macroeconomic data is provided which further evinces the value of these procedures.
    Keywords: Level break; Trend break; Stationary; Unit root; Locally best invariant test; Confidence sets
    JEL: C22
    URL: http://d.repec.org/n?u=RePEc:not:notgts:14/04&r=ecm
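The test-inversion idea behind such confidence sets can be illustrated with a stylized sketch. The code below (an editorial illustration, not the authors' locally best invariant tests) uses likelihood-ratio-type statistics with a placeholder critical value: the confidence set collects every candidate break date whose fit is not significantly worse than the best-fitting date.

```python
import numpy as np

rng = np.random.default_rng(0)
T, true_break, shift = 200, 120, 2.0
y = rng.standard_normal(T)
y[true_break:] += shift  # level break at t = 120

def ssr_with_break(y, tau):
    """SSR from fitting separate means before and after a break at tau."""
    return ((y[:tau] - y[:tau].mean())**2).sum() + ((y[tau:] - y[tau:].mean())**2).sum()

taus = np.arange(10, T - 10)          # trim the sample ends
ssr = np.array([ssr_with_break(y, t) for t in taus])
tau_hat = taus[ssr.argmin()]          # least-squares break-date estimate

# Invert a sequence of tests across candidate dates: keep every date whose
# LR-type statistic stays below the critical value (7.0 is a placeholder,
# not a critical value from the paper).
lr = T * np.log(ssr / ssr.min())
conf_set = taus[lr <= 7.0]
print(tau_hat, conf_set.min(), conf_set.max())
```

The set is wider when the break is small, which is exactly the coverage-versus-length tradeoff the Monte Carlo evidence in the paper examines.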
  5. By: Donald W.K. Andrews (Cowles Foundation, Yale University); Xiaoxia Shi (Dept. of Economics, University of Wisconsin, Madison)
    Abstract: In this paper, we construct confidence sets for models defined by many conditional moment inequalities/equalities. The conditional moment restrictions in the models can be finite, countably infinite, or uncountably infinite. To deal with the complication brought about by the vast number of moment restrictions, we exploit the manageability (Pollard (1990)) of the class of moment functions. We verify the manageability condition in five examples from the recent partial identification literature. The proposed confidence sets are shown to have correct asymptotic size in a uniform sense and to exclude parameter values outside the identified set with probability approaching one. Monte Carlo experiments for a conditional stochastic dominance example and a random-coefficients binary-outcome example support the theoretical results.
    Keywords: Asymptotic size, Conditional moment inequalities, Confidence set, Many moments, Multiple equilibria, Partial identification, Random coefficients, Stochastic dominance, Test
    JEL: C1 C2 C3
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2010&r=ecm
  6. By: Francisco Blasques (VU University Amsterdam, the Netherlands); Siem Jan Koopman (VU University Amsterdam, the Netherlands); Katarzyna Łasak (VU University Amsterdam, the Netherlands); André Lucas (VU University Amsterdam, the Netherlands)
    Abstract: We study the performance of alternative methods for calculating in-sample confidence bands and out-of-sample forecast bands for time-varying parameters. The in-sample bands reflect parameter uncertainty only. The out-of-sample bands reflect both parameter uncertainty and innovation uncertainty. The bands are applicable to a large class of observation driven models and a wide range of estimation procedures. A Monte Carlo study is conducted for time-varying parameter models such as generalized autoregressive conditional heteroskedasticity and autoregressive conditional duration models. Our results show clear differences between the actual coverage provided by the different methods. We illustrate our findings in a volatility analysis for monthly Standard & Poor’s 500 index returns.
    Keywords: autoregressive conditional duration; delta-method; generalized autoregressive
    JEL: C52 C53
    Date: 2015–07–09
    URL: http://d.repec.org/n?u=RePEc:tin:wpaper:20150083&r=ecm
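A rough sketch of simulation-based forecast bands for a GARCH(1,1) conditional variance follows. It captures innovation uncertainty only, with parameter values assumed rather than estimated, so it omits the parameter-uncertainty component that the paper's bands also account for.

```python
import numpy as np

rng = np.random.default_rng(1)
omega, alpha, beta = 0.1, 0.05, 0.90   # assumed GARCH(1,1) parameters
sigma2_T, H, S = 1.2, 20, 5000         # last filtered variance, horizon, paths

# Simulate S future variance paths; the cross-path spread at each horizon
# yields innovation-driven 90% forecast bands.
sig2 = np.full(S, sigma2_T)
bands = np.empty((H, 2))
for h in range(H):
    eps = np.sqrt(sig2) * rng.standard_normal(S)
    sig2 = omega + alpha * eps**2 + beta * sig2
    bands[h] = np.quantile(sig2, [0.05, 0.95])
print(bands[-1])
```

Adding parameter uncertainty would mean redrawing (omega, alpha, beta) for each path from the estimator's asymptotic distribution, which is one of the methods whose coverage the Monte Carlo study compares.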
  7. By: Oh, Dong Hwan (Board of Governors of the Federal Reserve System (U.S.)); Patton, Andrew J. (Duke University)
    Abstract: This paper presents flexible new models for the dependence structure, or copula, of economic variables based on a latent factor structure. The proposed models are particularly attractive for relatively high dimensional applications, involving fifty or more variables, and can be combined with semiparametric marginal distributions to obtain flexible multivariate distributions. Factor copulas generally lack a closed-form density, but we obtain analytical results for the implied tail dependence using extreme value theory, and we verify that simulation-based estimation using rank statistics is reliable even in high dimensions. We consider "scree" plots to aid the choice of the number of factors in the model. The model is applied to daily returns on all 100 constituents of the S&P 100 index, and we find significant evidence of tail dependence, heterogeneous dependence, and asymmetric dependence, with dependence being stronger in crashes than in booms. We also show that factor copula models provide superior estimates of some measures of systemic risk.
    Keywords: Copulas; correlation; dependence; systemic risk; tail dependence
    JEL: C31 C32 C51
    Date: 2015–05–18
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2015-51&r=ecm
  8. By: Franco Peracchi (Department of Economics, Georgetown University); Samantha Leorato (Department of Economics and Finance, Tor Vergata University)
    Abstract: Learning about the shape of a probability distribution, not just about its location or dispersion, is often an important goal of empirical analysis. Given a continuous random variable Y and a random vector X defined on the same probability space, the conditional distribution function (CDF) and the conditional quantile function (CQF) offer two equivalent ways of describing the shape of the conditional distribution of Y given X. To these equivalent representations correspond two alternative approaches to shape regression. One approach - distribution regression - is based on direct estimation of the CDF; the other approach - quantile regression - is instead based on direct estimation of the CQF. Since the CDF and the CQF are generalized inverses of each other, indirect estimates of the CQF and the CDF may be obtained by taking the generalized inverse of the direct estimates obtained from either approach, possibly after rearranging to guarantee monotonicity of estimated CDFs and CQFs. The equivalence between the two approaches holds for standard nonparametric estimators in the unconditional case. In the conditional case, when modeling assumptions are introduced to avoid curse-of-dimensionality problems, this equivalence is generally lost, as a convenient parametric model for the CDF need not imply a convenient parametric model for the CQF, and vice versa. Despite the vast literature on the quantile regression approach, and the recent attention to the distribution regression approach, no systematic comparison of the two has been carried out yet. Our paper fills this gap by comparing the asymptotic properties of estimators obtained from the two approaches, both when the assumed parametric models on which they are based are correctly specified and when they are not.
    Keywords: Distribution regression; quantile regression; functional delta-method; non-separable models; influence function
    JEL: C1 C21 C25
    Date: 2015–07–10
    URL: http://d.repec.org/n?u=RePEc:geo:guwopa:gueconwpa~15-15-06&r=ecm
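The distribution-regression route and its inversion to a conditional quantile can be sketched as follows. This is a hypothetical illustration: it uses a linear probability model at each cutoff in place of a parametric link such as probit, purely for brevity, and rearranges the fitted CDF to enforce monotonicity before taking the generalized inverse.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.standard_normal(n)   # true conditional median: 1 + 2x

X = np.column_stack([np.ones(n), x])
cuts = np.linspace(0.0, 5.0, 101)

# Distribution regression: model F(c | x) = P(Y <= c | x) at each cutoff c.
coefs = np.array([np.linalg.lstsq(X, (y <= c).astype(float), rcond=None)[0]
                  for c in cuts])

# Indirect quantile estimate at x = 0.5: rearrange the fitted CDF to be
# monotone, then take the generalized inverse (first cutoff reaching 0.5).
x0 = np.array([1.0, 0.5])
cdf_hat = np.maximum.accumulate(coefs @ x0)
median_hat = cuts[np.searchsorted(cdf_hat, 0.5)]
print(median_hat)   # true conditional median at x = 0.5 is 2.0
```

Running the same construction in reverse, from a fitted quantile model to an implied CDF, gives the other indirect estimator whose asymptotic properties the paper compares.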
  9. By: Conrad, Christian; Schienle, Melanie
    Abstract: We develop a misspecification test for the multiplicative two-component GARCH-MIDAS model suggested in Engle et al. (2013). In the GARCH-MIDAS model a short-term unit variance GARCH component fluctuates around a smoothly time-varying long-term component which is driven by the dynamics of an explanatory variable. We suggest a Lagrange Multiplier statistic for testing the null hypothesis that the variable has no explanatory power. Hence, under the null hypothesis the long-term component is constant and the GARCH-MIDAS model reduces to the simple GARCH model. We derive the asymptotic theory for our test statistic and investigate its finite sample properties by Monte Carlo simulation. The usefulness of our procedure is illustrated by an empirical application to S&P 500 return data.
    Keywords: Volatility Component Models; LM test; Long-term Volatility.
    Date: 2015–07–09
    URL: http://d.repec.org/n?u=RePEc:awi:wpaper:0597&r=ecm
  10. By: Steven T. Berry (Cowles Foundation, Yale University); Philip A. Haile (Cowles Foundation, Yale University)
    Abstract: We present new results on the identifiability of a class of nonseparable nonparametric simultaneous equations models introduced by Matzkin (2008). These models combine exclusion restrictions with a requirement that each structural error enter through a "residual index." Our identification results encompass a variety of special cases allowing tradeoffs between the exogenous variation required of instruments and restrictions on the joint density of structural errors. Among these special cases are results avoiding any density restriction and results allowing instruments with arbitrarily small support.
    Keywords: Simultaneous equations, Nonseparable models, Nonparametric identification
    JEL: C3 C14
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2008&r=ecm
  11. By: JIN SEO CHO (Yonsei University); TAE-HWAN KIM (Yonsei University); YONGCHEOL SHIN (University of York)
    Abstract: Xiao (2009) develops a novel estimation technique for quantile cointegrated time series by extending Phillips and Hansen's (1990) semiparametric approach and Saikkonen's (1991) parametrically augmented approach. This paper extends Pesaran and Shin's (1998) autoregressive distributed-lag approach into quantile regression by jointly analysing short-run dynamics and long-run cointegrating relationships across a range of quantiles. We derive the asymptotic theory and provide a general package in which the model can be estimated and tested within and across quantiles. We further affirm our theoretical results by Monte Carlo simulations. The main utilities of this analysis are demonstrated through the empirical application to the dividend policy in the U.S.
    Keywords: QARDL, Quantile Regression, Long-run Cointegrating Relationship, Dividend Smoothing, Time-varying Rolling Estimation.
    JEL: C22 G35
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:yon:wpaper:2015rwp-80&r=ecm
  12. By: Robert Novy-Marx
    Abstract: Strategies selected by combining multiple signals suffer severe overfitting biases, because underlying signals are typically signed such that each predicts positive in-sample returns. “Highly significant” backtested performance is easy to generate by selecting stocks on the basis of combinations of randomly generated signals, which by construction have no true power. This paper analyzes t-statistic distributions for multi-signal strategies, both empirically and theoretically, to determine appropriate critical values, which can be several times standard levels. Overfitting bias also severely exacerbates the multiple testing bias that arises when investigators consider more results than they present. Combining the best k out of n candidate signals yields a bias almost as large as that obtained by selecting the single best of n^k candidate signals.
    JEL: C58 G11
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:21329&r=ecm
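The selection mechanism is easy to demonstrate by simulation. The sketch below (an editorial illustration, not the paper's calculations) generates candidate signals with no true predictive power, signs each to predict positive in-sample returns, and then combines the best k of them, inflating the backtested t-statistic by construction.

```python
import numpy as np

rng = np.random.default_rng(3)
T, n, k = 120, 40, 3   # backtest months, candidate signals (pure noise), best-k

# Each "signal" yields a strategy return series with zero true mean.
returns = rng.standard_normal((n, T))
t_stats = returns.mean(axis=1) / (returns.std(axis=1, ddof=1) / np.sqrt(T))

# Signing every signal to predict positive in-sample returns makes all
# t-stats positive; selecting the best of n inflates the maximum further.
signed_t = np.abs(t_stats)
best_single = signed_t.max()

# Best k-of-n combination: averaging the k signals with the largest signed
# t-stats compounds the selection bias.
idx = np.argsort(signed_t)[-k:]
top = returns[idx] * np.sign(t_stats[idx])[:, None]
combo = top.mean(axis=0)
combo_t = combo.mean() / (combo.std(ddof=1) / np.sqrt(T))
print(best_single, combo_t)
```

Both statistics are positive by construction despite the signals being noise, which is why the appropriate critical values the paper derives can be several times the standard 1.96.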
  13. By: Christophe Gouel; Nicolas Legrand
    Abstract: We present a method to estimate jointly the parameters of a standard commodity storage model and the parameters characterizing the trend in commodity prices. This procedure allows the influence of a possible trend to be removed without restricting the model specification, and allows model and trend selection based on statistical criteria. The trend is modeled deterministically using linear or cubic spline functions of time. The results show that storage models with trend are always preferred to models without trend. They yield more plausible estimates of the structural parameters, with storage costs and demand elasticities that are more consistent with the literature. They imply occasional stockouts, whereas without trend the estimated models predict no stockouts over the sample period for most commodities. Moreover, accounting for a trend in the estimation implies price moments closer to those observed in commodity prices. Our results support the empirical relevance of the speculative storage model, and show that storage model estimations should not neglect the possibility of long-run price trends.
    Keywords: Commodity prices, non-linear dynamic models, storage, structural estimation, trend.
    JEL: C51 C52 Q11
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:drm:wpaper:2015-15&r=ecm
  14. By: Mikkel Bennedsen; Asger Lunde; Mikko S. Pakkanen
    Abstract: We introduce a simulation scheme for Brownian semistationary processes, which is based on discretizing the stochastic integral representation of the process in the time domain. We assume that the kernel function of the process is regularly varying at zero. The novel feature of the scheme is to approximate the kernel function by a power function near zero and by a step function elsewhere. The resulting approximation of the process is a combination of Wiener integrals of the power function and a Riemann sum, which is why we call this method a hybrid scheme. Our main theoretical result describes the asymptotics of the mean square error of the hybrid scheme and we observe that the scheme leads to a substantial improvement of accuracy compared to the ordinary forward Riemann-sum scheme, while having the same computational complexity. We exemplify the use of the hybrid scheme by two numerical experiments, where we examine the finite-sample properties of an estimator of the roughness parameter of a Brownian semistationary process and study Monte Carlo option pricing in the rough Bergomi model of Bayer et al. (2015), respectively.
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1507.03004&r=ecm
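As context for the improvement the hybrid scheme delivers, here is a sketch of the ordinary forward Riemann-sum benchmark it is compared against: the stochastic integral X(t) = ∫ g(t−s) dW(s) is discretized on a grid and truncated. The kernel g(x) = x^α e^{−x}, regularly varying at zero, is an assumed example; the hybrid scheme would replace the terms nearest zero with exact Wiener integrals of the power function x^α.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha, n, N = -0.2, 500, 2000   # kernel roughness, grid points per unit, truncation
dt = 1.0 / n
g = lambda x: x**alpha * np.exp(-x)   # kernel, regularly varying at zero

# Ordinary forward Riemann-sum scheme: X(t_i) ~ sum_k g(k*dt) * dW_{i-k},
# truncated after N terms; the singularity of g at zero is what degrades
# this scheme's accuracy and what the hybrid scheme corrects.
dW = np.sqrt(dt) * rng.standard_normal(N + n)
weights = g(dt * np.arange(1, N + 1))
X = np.array([weights @ dW[i:i + N][::-1] for i in range(n)])
print(X[:3])
```

The roughness parameter α here governs the small-scale behaviour of the path, and it is the object whose estimator the paper's first numerical experiment examines.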
  15. By: Shibo Liu (School of Business and Economics, Loughborough University); Tom Weyman-Jones (School of Business and Economics, Loughborough University); Karligash Glass (School of Business and Economics, Loughborough University)
    Abstract: The original Morey and Morey (1999) paper was the first to explicitly link the efficient theoretical frontier of the Markowitz portfolio balance model to the concept of the efficient empirical frontier in data envelopment analysis. The contribution of this paper is to extend the application of this linked research strategy to incorporate both sampling error addressed through bootstrapping and contextual explanation of the efficiency results through statistically robust second stage analysis. This paper first applies the procedures in Morey and Morey (1999) to a new modern data set comprising a multi-year sample of investment funds and then utilises Simar-Wilson (2008) bootstrapping algorithms to develop statistical inference and confidence intervals for the indexes of efficient investment fund performance. For the second stage analysis, robust-OLS regression, Tobit models and Papke-Wooldridge (PW) models are conducted and compared to evaluate contextual variables affecting the performance of investment funds.
    Keywords: nonlinear-DEA, portfolios, bootstrapping, second stage DEA
    JEL: C14 G1 G23
    Date: 2015–01
    URL: http://d.repec.org/n?u=RePEc:lbo:lbowps:2015_01&r=ecm
  16. By: Biørn, Erik (Dept. of Economics, University of Oslo); Han, Xuehui (Economics and Research Department)
    Abstract: GMM estimation of autoregressive panel data equations with error-ridden variables when the noise has memory is considered. The impact of variation in the memory length, in the signal-noise spread, and in the degree of individual heterogeneity is discussed with respect to finite sample bias, using Monte Carlo simulations. Also explored are the impact of the strength of autocorrelation and the size of the IV set. GMM procedures using IVs in differences on equations in levels in general perform better in small samples than procedures using IVs in levels on equations in differences. A case study of the impact of Foreign Direct Investment (FDI) on GDP, inter alia contrasting the manufacturing and the service sector, based on country panel data supplements the simulation results.
    Keywords: Panel data; Measurement error; ARMA; GMM; Error memory; Monte Carlo; Foreign Direct Investment; Economic development; Country panel
    JEL: C21 C23 C31 C33 O11 O14
    Date: 2015–02–12
    URL: http://d.repec.org/n?u=RePEc:hhs:osloec:2015_004&r=ecm
  17. By: Vassili Bazinas (Dept of Economics Oxford University); Bent Nielsen (Dept of Economics, Institute for Economic Modeling at the Oxford Martin School, Oxford University)
    Abstract: We propose a method to explore the causal transmission of a catalyst variable through two endogenous variables of interest. The method is based on the reduced-form system formed from the conditional distribution of the two endogenous variables given the catalyst. The method combines elements from instrumental variable analysis and Cholesky decomposition of structural vector autoregressions. We give conditions for uniqueness of the causal transmission.
    Date: 2015–07–13
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:1507&r=ecm
  18. By: Joshua B. Miller; Adam Sanjurjo
    Abstract: We find a subtle but substantial bias in a standard measure of the conditional dependence of present outcomes on streaks of past outcomes in sequential data. The mechanism is driven by a form of selection bias, which leads to an underestimate of the true conditional probability of a given outcome when conditioning on prior outcomes of the same kind. The biased measure has been used prominently in the literature that investigates incorrect beliefs in sequential decision making—most notably the Gambler’s Fallacy and the Hot Hand Fallacy. Upon correcting for the bias, the conclusions of some prominent studies in the literature are reversed. The bias also provides a structural explanation of why the belief in the law of small numbers persists, as repeated experience with finite sequences can only reinforce these beliefs, on average.
    Keywords: Law of Small Numbers; Alternation Bias; Negative Recency Bias; Gambler’s Fallacy; Hot Hand Fallacy; Hot Hand Effect; Sequential Decision Making; Sequential Data; Selection Bias; Finite Sample Bias; Small Sample Bias
    JEL: C12 C14 C18 C19 C91 D03 G02
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:igi:igierp:552&r=ecm
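The bias is easy to verify by brute force. This short enumeration (an editorial illustration, not the authors' code) averages, across all fair-coin sequences of length three, the proportion of heads immediately following a head, restricted to sequences where that proportion is defined:

```python
from itertools import product

def prop_heads_after_heads(seq):
    """Fraction of flips immediately following a head (1) that are heads."""
    follows = [seq[i + 1] for i in range(len(seq) - 1) if seq[i] == 1]
    return sum(follows) / len(follows) if follows else None

# Enumerate all 8 equally likely sequences; 6 have at least one head to
# condition on. Averaging the per-sequence proportion across those 6 gives
# less than 1/2, even though the coin is fair.
props = [p for seq in product([0, 1], repeat=3)
         if (p := prop_heads_after_heads(seq)) is not None]
avg = sum(props) / len(props)
print(avg)   # 5/12 ≈ 0.4167, not 0.5
```

Averaging the proportion sequence-by-sequence, rather than pooling flips, is precisely the selection mechanism that biases the standard streak measure downward.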
  19. By: Fengler, Matthias R.; Herwartz, Helmut
    Abstract: In highly integrated markets, news spreads at a fast pace and bedevils risk monitoring and optimal asset allocation. We therefore propose global and disaggregated measures of variance transmission that allow one to assess spillovers locally in time. Key to our approach is the vector ARMA representation of the second-order dynamics of the popular BEKK model. In an empirical application to a four-dimensional system of US asset classes - equity, fixed income, foreign exchange and commodities - we illustrate the second-order transmissions at various levels of (dis)aggregation. Moreover, we demonstrate that the proposed spillover indices are informative on the value-at-risk violations of portfolios composed of the considered asset classes.
    Keywords: Multivariate GARCH, spillover index, value-at-risk, variance spillovers, variance decomposition
    JEL: C32 C58 F3 G1
    Date: 2015–07
    URL: http://d.repec.org/n?u=RePEc:usg:econwp:2015:17&r=ecm

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.