nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒03‒22
Nineteen papers chosen by
Sune Karlsson
Örebro University

  1. Targeting estimation of CCC-Garch models with infinite fourth moments By Rasmus Søndergaard Pedersen
  2. Robust Hypothesis Tests for M-Estimators with Possibly Non-differentiable Estimating Functions By Wei-Ming Lee; Yu-Chin Hsu; Chung-Ming Kuan
  3. Asymptotic Refinements of a Misspecification-Robust Bootstrap for GEL Estimators By Seojeong Lee
  4. CCE estimation of factor-augmented regression models with more factors than observables By Urbain J.R.Y.J.; Karabiyik H.; Westerlund J.
  5. A multivariate invariance principle for modified wild bootstrap methods with an application to unit root testing By Smeekes S.; Urbain J.R.Y.J.
  6. A New Bootstrap Test for the Validity of a Set of Marginal Models for Multiple Dependent Time Series: an Application to Risk Analysis By David Ardia; Lukasz Gatarek; Lennart F. Hoogerheide
  7. Outliers in multivariate Garch models By Aurea Grané; Belén Martín-Barragán; Helena Veiga
  8. A criterion for the number of factors in a data-rich environment By Reijer, Ard H.J. de; Jacobs, Jan P.A.M.; Otter, Pieter W.
  9. An importance sampling algorithm for copula models in insurance By Philipp Arbenz; Mathieu Cambou; Marius Hofert
  10. A simple consistent test of conditional symmetry in symmetrically trimmed tobit models By Tao Chen; Gautam Tripathi
  11. On trend-cycle-seasonal interactions By Irma Hindrayanto; Jan Jacobs; Denise Osborn
  12. Modeling volatility with Range-based Heterogeneous Autoregressive Conditional Heteroskedasticity model By Tomasz Skoczylas
  13. Outliers and Persistence in Threshold Autoregressive Processes: A Puzzle? By Yamin Ahmad; Luiggi Donayre
  14. Estimation of Factor-Augmented Panel Regressions with Weakly Influential Factors By Westerlund, Joakim; Reese, Simon
  15. Temporal Aggregation of Random Walk Processes and Implications for Asset Prices By Yamin Ahmad; Ivan Paya
  16. The Quantity and Quality of Children: A Semi-Parametric Bayesian IV Approach By Frühwirth-Schnatter, Sylvia; Halla, Martin; Posekany, Alexandra; Pruckner, Gerald J.; Schober, Thomas
  17. Least quartic Regression Criterion with Application to Finance By Giuseppe Arbia
  18. Exploiting the Choice-Consumption Mismatch: A New Approach to Disentangle State Dependence and Heterogeneity By K. Sudhir; Nathan Yang
  19. Instrumental Variables: An Econometrician's Perspective By Guido W. Imbens

  1. By: Rasmus Søndergaard Pedersen (Department of Economics, Copenhagen University)
    Abstract: As an alternative to quasi-maximum likelihood, targeting estimation is a widely applied estimation method for univariate and multivariate GARCH models. For variance targeting estimation, recent research has pointed out that finite fourth-order moments of the data generating process are required if one wants to perform inference in GARCH models relying on asymptotic normality of the estimator; see Pedersen and Rahbek (2014) and Francq et al. (2011). Such moment conditions may not be satisfied in practice for financial returns, highlighting a major drawback of variance targeting estimation. In this paper we consider the large-sample properties of the variance targeting estimator for the multivariate extended constant conditional correlation GARCH model when the distribution of the data generating process has infinite fourth moments. Using non-standard limit theory, we derive new results for the estimator, showing that its limiting distribution is multivariate stable. The rate of consistency of the estimator is slower than the square root of T (the rate obtained by the quasi-maximum likelihood estimator) and depends on the tails of the data generating process.
    Keywords: Targeting; variance targeting; multivariate GARCH; constant conditional correlation; asymptotic theory; time series; multivariate regular variation; stable distributions
    JEL: C32 C51 C58
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:kud:kuiedp:1404&r=ecm
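    A minimal univariate sketch of the variance-targeting idea (illustrative only; the paper treats the multivariate extended CCC-GARCH model and its limit theory under infinite fourth moments): in a GARCH(1,1), the intercept is tied to the sample variance through omega = sigma2_bar*(1 - alpha - beta), so that only (alpha, beta) remain to be estimated by quasi-maximum likelihood.

      import numpy as np
      from scipy.optimize import minimize

      def vt_garch11_negloglik(params, r, sigma2_bar):
          # Gaussian quasi-negative-log-likelihood with variance targeting:
          # the intercept is fixed at omega = sigma2_bar * (1 - alpha - beta).
          # r: float numpy array of (demeaned) returns.
          alpha, beta = params
          if min(alpha, beta) < 0 or alpha + beta >= 0.999:
              return 1e10                        # keep the implied omega positive
          omega = sigma2_bar * (1.0 - alpha - beta)
          h = np.empty_like(r)
          h[0] = sigma2_bar                      # start the recursion at the sample variance
          for t in range(1, len(r)):
              h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
          return 0.5 * np.sum(np.log(h) + r ** 2 / h)

      def fit_vt_garch11(r):
          sigma2_bar = np.var(r)                 # targeting step: plug in the sample variance
          res = minimize(vt_garch11_negloglik, x0=[0.05, 0.90],
                         args=(r, sigma2_bar), method="Nelder-Mead")
          alpha, beta = res.x
          return alpha, beta, sigma2_bar * (1.0 - alpha - beta)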
  2. By: Wei-Ming Lee (Department of Economics National Chung Cheng University); Yu-Chin Hsu (Institute of Economics, Academia Sinica, Taipei, Taiwan); Chung-Ming Kuan (Department of Finance National Taiwan University)
    Abstract: We propose a new robust hypothesis test for (possibly nonlinear) constraints on M-estimators with possibly non-differentiable estimating functions. The proposed test employs a random normalizing matrix computed from recursive M-estimators to eliminate the nuisance parameters arising from the asymptotic covariance matrix. It does not require consistent estimation of any nuisance parameters, in contrast with the conventional heteroskedasticity and autocorrelation consistent (HAC)-type test and the KVB-type test of Kiefer, Vogelsang, and Bunzel (2000). Our test reduces to the KVB-type test in simple location models with OLS estimation, so the error in rejection probability of our test in a Gaussian location model is O(T^{-1} log T). We discuss robust testing in quantile regression and censored regression models in detail. In simulation studies, we find that our test has better size control and better finite sample power than the HAC-type and KVB-type tests.
    Keywords: censored regression, generalized method of moments, robust hypothesis testing, KVB approach, M-estimator, quantile regression
    JEL: C12 C22
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:sin:wpaper:14-a004&r=ecm
  3. By: Seojeong Lee (School of Economics, Australian School of Business, the University of New South Wales)
    Abstract: I propose a nonparametric iid bootstrap procedure for the empirical likelihood, the exponential tilting, and the exponentially tilted empirical likelihood estimators that achieves sharp asymptotic refinements for t tests and confidence intervals based on such estimators. Furthermore, the proposed bootstrap is robust to model misspecification, i.e., it achieves asymptotic refinements regardless of whether the assumed moment condition model is correctly specified or not. This result is new, because asymptotic refinements of the bootstrap based on these estimators have not been established in the literature even under correct model specification. Monte Carlo experiments are conducted in a dynamic panel data setting to support the theoretical finding. As an application, bootstrap confidence intervals for the returns to schooling of Hellerstein and Imbens (1999) are calculated. The returns to schooling may be higher.
    Keywords: generalized empirical likelihood, bootstrap, asymptotic refinement, model misspecification
    JEL: C14 C15 C31 C33
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:swe:wpaper:2014-02&r=ecm
  4. By: Urbain J.R.Y.J.; Karabiyik H.; Westerlund J. (GSBE)
    Abstract: This paper considers estimation of factor-augmented panel data regression models with homogeneous slope coefficients. One of the most popular approaches towards this end is the pooled common correlated effects (CCE) estimator of Pesaran (2006). For this estimator to be consistent at the usual sqrt(NT) rate, where N and T denote the number of cross-section and time series observations, respectively, the number of factors cannot be larger than the number of observables. This is a problem in the typical application involving only a small number of regressors. The current paper proposes a simple extension to the CCE procedure by which the requirement can be relaxed. The CCE approach is based on taking the cross-section average of the observables as an estimator of the common factors. The idea put forth in the current paper is to consider not only the average but also other cross-section combinations. The asymptotic properties of the resulting combination-augmented CCE (C3E) estimator are provided and verified in small samples using Monte Carlo simulation.
    Keywords: Hypothesis Testing: General; Estimation: General; Multiple or Simultaneous Equation Models: Models with Panel Data; Longitudinal Data; Spatial Time Series;
    JEL: C12 C13 C33
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:unm:umagsb:2014007&r=ecm
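    For reference, the baseline pooled CCE estimator augments the regression with the cross-section averages of the dependent variable and the regressors as proxies for the common factors. A rough numpy sketch of that baseline (the paper's C3E extension would add further cross-section combinations to the augmentation matrix):

      import numpy as np

      def cce_pooled(y, X):
          # y: (N, T) array of outcomes, X: (N, T, k) array of regressors.
          # Rough sketch of the pooled CCE estimator of Pesaran (2006).
          N, T, k = X.shape
          H = np.column_stack([y.mean(axis=0), X.mean(axis=0)])   # T x (1+k) cross-section averages
          M = np.eye(T) - H @ np.linalg.pinv(H.T @ H) @ H.T       # project off the factor proxies
          A = np.zeros((k, k))
          b = np.zeros(k)
          for i in range(N):
              Xi, yi = X[i], y[i]
              A += Xi.T @ M @ Xi
              b += Xi.T @ M @ yi
          return np.linalg.solve(A, b)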
  5. By: Smeekes S.; Urbain J.R.Y.J. (GSBE)
    Abstract: In this paper we consider several modified wild bootstrap methods that, additionally to heteroskedasticity, can take dependence into account. The modified wild bootstrap methods are shown to correctly replicate an invariance principle for multivariate time series that are characterized by general forms of unconditional heteroskedasticity, or nonstationary volatility, as well as dependence within and between different elements of the time series. The invariance principle is then applied to derive the asymptotic validity of the wild bootstrap methods for unit root testing in a multivariate setting. The resulting tests, which can also be interpreted as panel unit root tests, are valid under more general assumptions than most current tests used in the literature. A simulation study is performed to evaluate the small sample properties of the bootstrap unit root tests.
    Keywords: Statistical Simulation Methods: General; Multiple or Simultaneous Equation Models: Time-Series Models; Dynamic Quantile Regressions; Dynamic Treatment Effect Models;
    JEL: C15 C32
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:unm:umagsb:2014008&r=ecm
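    One common way to make the wild bootstrap dependence-robust is to let all observations in a block share the same multiplier (a block wild bootstrap). A minimal sketch of a single bootstrap draw along these lines (an illustration of the general idea only; the paper analyses several modified wild bootstrap schemes and their validity for multivariate unit root testing):

      import numpy as np

      def block_wild_bootstrap_draw(u, block_len=10, seed=None):
          # u: (T, m) matrix of residuals from the m series.
          # Each block of block_len consecutive observations shares one N(0,1)
          # multiplier, preserving unconditional heteroskedasticity and some
          # short-run dependence within and between series.
          rng = np.random.default_rng(seed)
          T = u.shape[0]
          n_blocks = int(np.ceil(T / block_len))
          xi = np.repeat(rng.standard_normal(n_blocks), block_len)[:T]
          return u * xi[:, None]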
  6. By: David Ardia; Lukasz Gatarek; Lennart F. Hoogerheide
    Abstract: A novel simulation-based methodology is proposed to test the validity of a set of marginal time series models, where the dependence structure between the time series is taken ‘directly’ from the observed data. The procedure is useful when one wants to summarize the test results for several time series in one joint test statistic and p-value. The proposed test method can have higher power than a test for a univariate time series, especially for short time series. Therefore our test for multiple time series is particularly useful if one wants to assess Value-at-Risk (or Expected Shortfall) predictions over a small time frame (e.g., a crisis period). We apply our method to test GARCH model specifications for a large panel data set of stock returns.
    Keywords: Bootstrap test, GARCH, Marginal models, Multiple time series, Value-at-Risk
    JEL: C1 C12 C22 C44
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:1413&r=ecm
  7. By: Aurea Grané; Belén Martín-Barragán; Helena Veiga
    Abstract: Outliers of moderate magnitude cause large changes in financial time series of prices and returns and affect both the estimation of parameters and volatilities when fitting a GARCH-type model. The multivariate setting is still to be studied, but similar biases and impacts on correlation dynamics are believed to exist. The accurate estimation of the correlation structure is crucial in many applications, such as portfolio allocation and risk management. This paper focuses on these issues by studying the impact of additive outliers (isolated and patches of level outliers and volatility outliers) on the estimation of correlations when fitting well-known multivariate GARCH models and by proposing a general detection algorithm based on wavelets that can be applied to a large class of multivariate volatility models. This procedure can also be interpreted as a model misspecification test, since it is based on residual diagnostics. The effectiveness of the new proposal is evaluated in an intensive Monte Carlo study before it is applied to daily stock market indices. The simulation studies show that correlations are highly affected by the presence of outliers and that the new method is both effective and reliable, since it detects very few false outliers.
    Keywords: Additive Outliers, Correlations, Volatilities, Wavelets
    JEL: C10 C13 C53 C58 G17
    Date: 2014–02
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws140503&r=ecm
  8. By: Reijer, Ard H.J. de; Jacobs, Jan P.A.M.; Otter, Pieter W. (Groningen University)
    Abstract: This paper derives a new criterion for the determination of the number of factors in static approximate factor models, which is closely related to the scree test. Our criterion looks for the number of eigenvalues for which the difference between adjacent eigenvalue-component number blocks is maximized. Monte Carlo experiments compare the properties of our criterion to the Edge Distribution (ED) estimator of Onatski (2010) and the two eigenvalue ratio estimators of Ahn and Horenstein (2013). Our criterion outperforms the latter two for all sample sizes and the ED estimator of Onatski (2010) for samples up to 300 variables/observations.
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:dgr:rugsom:14008-eef&r=ecm
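    The scree-type idea behind such criteria can be illustrated with two simple rules applied to the ordered eigenvalues of the scaled sample covariance matrix: select the component index at which the adjacent eigenvalue gap, or the Ahn-Horenstein eigenvalue ratio, is largest. A rough sketch (illustrative only; the paper's criterion differs in its exact construction):

      import numpy as np

      def factor_number_sketch(X, kmax=10):
          # X: (T, N) matrix of standardized data.
          T, N = X.shape
          lam = np.sort(np.linalg.eigvalsh(X.T @ X / (T * N)))[::-1]  # descending eigenvalues
          gaps = lam[:kmax] - lam[1:kmax + 1]        # adjacent differences (scree-type rule)
          ratios = lam[:kmax] / lam[1:kmax + 1]      # Ahn-Horenstein eigenvalue ratios
          return 1 + int(np.argmax(gaps)), 1 + int(np.argmax(ratios))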
  9. By: Philipp Arbenz; Mathieu Cambou; Marius Hofert
    Abstract: An importance sampling algorithm for copula models is introduced. The method improves Monte Carlo estimators when the functional of interest depends mainly on the behaviour of the underlying random vector when at least one of the components is large. Such problems often arise from dependence models in finance and insurance. The importance sampling framework we propose is general and can be easily implemented for all classes of copula models from which sampling is feasible. We show how the proposal distribution can be optimized to reduce the sampling error. In a case study inspired by a typical multivariate insurance application, we obtain variance reduction factors between 10 and 20 in comparison to standard Monte Carlo estimators.
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1403.4291&r=ecm
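    The underlying variance-reduction idea is generic: sample from a proposal that places more mass on the rare region of interest and reweight each draw by the likelihood ratio. A toy one-dimensional sketch (purely illustrative; the paper's proposal distribution is built specifically for copula models and joint tail events):

      import numpy as np

      def is_tail_prob(q, shift=3.0, n=100_000, seed=None):
          # Importance-sampling estimate of P(Z > q) for Z ~ N(0,1), drawing from
          # the shifted proposal N(shift, 1) and reweighting by the density ratio.
          rng = np.random.default_rng(seed)
          z = rng.standard_normal(n) + shift
          w = np.exp(-shift * z + 0.5 * shift ** 2)   # phi(z) / phi(z - shift)
          return np.mean((z > q) * w)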
  10. By: Tao Chen (Department of Economics, University of Waterloo, Ontario, Canada); Gautam Tripathi (CREA, Université de Luxembourg)
    Abstract: We propose a "weighted and sample-size adjusted" Kolmogorov-Smirnov type statistic to test the assumption of conditional symmetry maintained in the symmetrically trimmed least-squares (STLS) approach of Powell (1986b), which is widely used to estimate censored or truncated regression models without making distributional assumptions. The statistic proposed here is consistent and computationally easy to implement because, unlike traditional Kolmogorov-Smirnov statistics, it is not optimized over an uncountable set. Moreover, it does not require any nonparametric smoothing, although we test the validity of a conditional feature. We also propose a bootstrap procedure to obtain the p-values and critical values that are required to carry out the test in practical applications. Results from a simulation study suggest that our test can work very well even in small to moderately sized samples. As an empirical illustration, we apply our test to two datasets that have been used in the literature to estimate censored regression models using Powell's STLS approach, to check whether the assumption of conditional symmetry is supported by these datasets.
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:luc:wpaper:14-04&r=ecm
  11. By: Irma Hindrayanto; Jan Jacobs; Denise Osborn
    Abstract: Traditional unobserved component models assume that the trend, cycle and seasonal components of an individual time series evolve separately over time. Although this assumption has been relaxed in recent papers that focus on trend-cycle interactions, it remains at the core of all seasonal adjustment methods applied by official statistical agencies around the world. The present paper develops an unobserved components model that permits non-zero correlations between seasonal and non-seasonal shocks, hence allowing the zero-correlation assumption that is traditionally imposed to be tested. Identification conditions for estimation of the parameters are discussed, while applications to observed time series illustrate the model and its implications for seasonal adjustment.
    Keywords: trend-cycle-seasonal decomposition; unobserved components; state-space models; seasonal adjustment; global real economic activity; unemployment
    JEL: C22 E24 E32 E37 F01
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:dnb:dnbwpp:417&r=ecm
  12. By: Tomasz Skoczylas (Faculty of Economic Sciences, University of Warsaw)
    Abstract: In this paper a new ARCH-type volatility model is proposed. The Range-based Heterogeneous Autoregressive Conditional Heteroskedasticity (RHARCH) model draws inspiration from the Heterogeneous Autoregressive Conditional Heteroskedasticity model presented by Müller et al. (1995), but employs more efficient, range-based volatility estimators instead of simple squared returns in the conditional variance equation. In the first part of this research, range-based volatility estimators (such as the Parkinson or Garman-Klass estimators) are reviewed, followed by the derivation of the RHARCH model. In the second part, the RHARCH model is compared with selected ARCH-type models, with particular emphasis on forecasting accuracy. All models are estimated using data on EURPLN spot rate quotations. Results show that the RHARCH model often outperforms return-based models in terms of predictive ability in both in-sample and out-of-sample periods. The properties of the standardized residuals are also very encouraging in the case of the RHARCH model.
    Keywords: volatility modelling, volatility forecasting, ARCH, range-based volatility estimators, heterogeneity of volatility
    JEL: C13 C22 C53
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:war:wpaper:2014-06&r=ecm
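    The range-based building blocks mentioned in the abstract have simple closed forms; for daily open, high, low and close prices, the Parkinson and Garman-Klass variance estimates can be sketched as follows. In the spirit of the abstract, such per-day variance proxies would replace squared returns in the conditional variance equation.

      import numpy as np

      def parkinson(high, low):
          # Parkinson (1980) daily variance estimate from the high-low range.
          return np.log(high / low) ** 2 / (4.0 * np.log(2.0))

      def garman_klass(open_, high, low, close):
          # Garman-Klass (1980) daily variance estimate from OHLC prices.
          hl = np.log(high / low)
          co = np.log(close / open_)
          return 0.5 * hl ** 2 - (2.0 * np.log(2.0) - 1.0) * co ** 2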
  13. By: Yamin Ahmad (Department of Economics, University of Wisconsin - Whitewater); Luiggi Donayre (Department of Economics, University of Minnesota - Duluth)
    Abstract: We conduct Monte Carlo simulations to investigate the effects of outlier observations on the properties of linearity tests against threshold autoregressive (TAR) processes. By considering different specifications and levels of persistence of the data generating processes, we find that outliers distort the size of the test and that the distortion increases with the level of persistence. However, contrary to what one might expect, we also find that larger outliers could help improve the power of the test in the case of persistent TAR processes.
    Keywords: Outliers, Persistence, Monte Carlo Simulations, Threshold Autoregression, Size, Power
    JEL: C15 C22
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:uww:wpaper:14-02&r=ecm
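    A minimal sketch of the kind of experiment described: simulate a two-regime TAR(1) process and contaminate it with a few additive outliers before applying a linearity test (regime parameters, outlier count and magnitude below are placeholders, not the paper's design):

      import numpy as np

      def simulate_tar_with_outliers(T=500, rho=(0.9, 0.3), thresh=0.0,
                                     n_outliers=5, magnitude=5.0, seed=None):
          rng = np.random.default_rng(seed)
          y = np.zeros(T)
          for t in range(1, T):
              rho_t = rho[0] if y[t - 1] <= thresh else rho[1]   # regime depends on the lag
              y[t] = rho_t * y[t - 1] + rng.standard_normal()
          idx = rng.choice(T, size=n_outliers, replace=False)
          y_out = y.copy()
          y_out[idx] += magnitude                                # additive outliers
          return y, y_out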
  14. By: Westerlund, Joakim (Deakin University, Australia); Reese, Simon (Department of Economics, Lund University)
    Abstract: The use of factor-augmented panel regressions has become very popular in recent years. Existing methods for such regressions require that the common factors are strong, such that their cumulative loadings rise proportionally to the number of cross-sectional units, which of course need not be the case in practice. Motivated by this, the current paper offers an in-depth analysis of the effect of non-strong factors on two of the most popular estimators for factor-augmented regressions, namely, principal components (PC) and common correlated effects (CCE). Conditions for consistency and asymptotic normality are established, which are shown to depend critically on a so far overlooked relationship between factor strength and the relative expansion rate of the dimensions of the panel.
    Keywords: Non-strong common factors; factor-augmented panel regressions; common factor models
    JEL: C12 C13 C33
    Date: 2014–02–13
    URL: http://d.repec.org/n?u=RePEc:hhs:lunewp:2014_008&r=ecm
  15. By: Yamin Ahmad (Department of Economics, University of Wisconsin - Whitewater); Ivan Paya (Department of Economics, Lancaster University Management School)
    Abstract: This paper examines the impact of time averaging and interval sampling of data, assuming that the data generating process for a given series follows a random walk with uncorrelated increments. We provide expressions for the corresponding variances and covariances, for both the levels and differences of the aggregated series, demonstrating how the degree of temporal aggregation affects these particular properties. Moreover, we analytically derive any differences that arise between the aggregated series and its disaggregated counterpart, and show that they can be decomposed into a distortionary and a small sample effect. We also provide exact expressions for the variance ratios, Sharpe ratios, and correlation coefficients for any level of aggregation. We discuss our results in the context of asset prices, where these measures have been used extensively.
    Keywords: Temporal Aggregation, Random Walk, Variance Ratio, Sharpe Ratio
    JEL: F47 C15 C32
    Date: 2014–01
    URL: http://d.repec.org/n?u=RePEc:uww:wpaper:14-01&r=ecm
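    The flavour of the comparison can be reproduced by simulation: for a unit-variance random walk, compare the variance of differences of the series sampled every m periods (point sampling) with the variance of first differences of non-overlapping m-period averages (time averaging). A small sketch (the paper derives the corresponding expressions analytically):

      import numpy as np

      def aggregation_variances(m=5, T=200_000, seed=None):
          rng = np.random.default_rng(seed)
          x = np.cumsum(rng.standard_normal(T))            # random walk with iid increments
          skip = x[m - 1::m]                                # point-in-time sampling every m periods
          avg = x[:(T // m) * m].reshape(-1, m).mean(1)     # non-overlapping m-period averages
          return np.var(np.diff(skip)), np.var(np.diff(avg))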
  16. By: Frühwirth-Schnatter, Sylvia (Vienna University of Economics and Business); Halla, Martin (University of Linz); Posekany, Alexandra (Vienna University of Economics and Business); Pruckner, Gerald J. (University of Linz); Schober, Thomas (University of Linz)
    Abstract: Prior empirical research on the theoretically proposed interaction between the quantity and the quality of children builds on exogenous variation in family size due to twin births and focuses on human capital outcomes. The typical finding can be described as a statistically nonsignificant two-stage least squares (2SLS) estimate, with substantial standard errors. We regard these conclusions of no empirical support for the quantity-quality trade-off as premature and, therefore, extend the empirical approach in two ways. First, we add health as an additional outcome dimension. Second, we apply a semi-parametric Bayesian IV approach for econometric inference. Our estimation results substantiate the finding of a zero effect: we provide estimates with precision increased by a factor of approximately twenty-three, for a broader set of outcomes.
    Keywords: quantity-quality model of fertility, family size, human capital, health, semi-parametric Bayesian IV approach
    JEL: J13 C26 C11 I20 J20 I10
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp8024&r=ecm
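    The 2SLS benchmark that the paper's semi-parametric Bayesian IV approach is set against can be written in a few lines; a textbook sketch (not the paper's estimator):

      import numpy as np

      def tsls(y, X, Z):
          # y: (n,) outcome; X: (n, k) regressors incl. constant and endogenous variables;
          # Z: (n, l) instruments incl. constant and exogenous controls, l >= k.
          Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]   # first stage: fitted regressors
          beta = np.linalg.lstsq(Xhat, y, rcond=None)[0]    # second stage
          return beta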
  17. By: Giuseppe Arbia
    Abstract: This article proposes a new method for the estimation of the parameters of a simple linear regression model which accounts for the role of co-moments in non-Gaussian distributions and is based on the minimization of a quartic loss function. Although the proposed method is very general, we examine its application to finance. In fact, in this field the contribution of the co-moments in explaining the return-generating process is of paramount importance when evaluating the systematic risk of an asset within the framework of the Capital Asset Pricing Model (CAPM). The suggested new method contributes to this literature by showing that, in the presence of non-normality, the regression slope can be expressed as a function of the co-kurtosis between the returns of a risky asset and the market proxy. The paper provides an illustration of the method based on empirical financial data referring to the rates of 40 industrial sector assets on the Italian stock market.
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1403.4171&r=ecm
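    The criterion itself is straightforward to sketch: replace the squared-residual loss of ordinary least squares with a fourth-power loss and minimize it numerically. A rough illustration using a generic optimizer (the paper additionally links the resulting slope to the co-kurtosis between asset and market returns):

      import numpy as np
      from scipy.optimize import minimize

      def least_quartic(X, y):
          # Minimize the sum of fourth powers of residuals y - [1, X] @ b.
          X1 = np.column_stack([np.ones(len(y)), X])
          beta0 = np.linalg.lstsq(X1, y, rcond=None)[0]     # OLS starting values
          obj = lambda b: np.sum((y - X1 @ b) ** 4)
          return minimize(obj, beta0, method="BFGS").x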
  18. By: K. Sudhir (Cowles Foundation and Yale School of Management); Nathan Yang (Yale School of Management)
    Abstract: This paper offers a new identification strategy for disentangling structural state dependence from unobserved heterogeneity in preferences. Our strategy exploits market environments where there is a choice-consumption mismatch. We first demonstrate the effectiveness of our identification strategy in obtaining unbiased state dependence estimates via Monte Carlo analysis and highlight its superiority relative to the extant choice-set variation based approach. In an empirical application that uses data of repeat transactions from the car rental industry, we find evidence of structural state dependence, but show that state dependence effects may be overstated without exploiting the choice-consumption mismatches that materialize through free upgrades.
    Keywords: Consumer dynamics, Heterogeneity, Quasi-experiment econometrics, Service industry, State dependence
    JEL: C1 C5 L00 L80 M2 M3
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1941&r=ecm
  19. By: Guido W. Imbens
    Abstract: I review recent work in the statistics literature on instrumental variables methods from an econometrics perspective. I discuss some of the older economic applications, including supply and demand models, and relate them to the recent applications in settings of randomized experiments with noncompliance. I discuss the assumptions underlying instrumental variables methods and the settings in which these may be plausible. By providing context for the current applications, a better understanding of the applicability of these methods may arise.
    JEL: C01
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:19983&r=ecm

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.