nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒06‒04
twelve papers chosen by
Sune Karlsson
Orebro University

  1. An almost closed form estimator for the EGARCH model By HAFNER, Christian; LINTON, Oliver
  2. Smooth Minimum Distance Estimation and Testing with Conditional Estimating Equations: Uniform in Bandwidth Theory By Lavergne, Pascal; Patilea, Valentin
  3. Is Spatial Bootstrapping a Panacea for Valid Inference? By Torben Klarl
  4. Sieve Quasi Likelihood Ratio Inference on Semi/nonparametric Conditional Moment Models By Xiaohong Chen; Demian Pouzo
  5. Bootstrap Fractional Integration Tests in Heteroskedastic ARFIMA Models By Giuseppe Cavaliere; Morten Ørregaard Nielsen; A.M. Robert Taylor
  6. Misclassification in Binary Choice Models By Bruce Meyer; Nikolas Mittag
  7. A Jarque-Bera test for sphericity of a large-dimensional covariance matrix By Glombek, Konstantin
  8. Semi-automatic Non-linear Model selection By Jennifer Castle; David Hendry
  9. Forecasting Value-at-Risk using Block Structure Multivariate Stochastic Volatility Models By Manabu Asai; Massimiliano Caporin; Michael McAleer
  10. Risk Measure Estimation on FIEGARCH Processes By Taiane S. Prass; Sílvia R. C. Lopes
  11. Ten Things You Should Know About DCC By Caporin, M.; McAleer, M.J.
  12. Explosive Bubble Modelling by Noncausal Process By Christian Gouriéroux; Jean-Michel Zakoian

  1. By: HAFNER, Christian (Université catholique de Louvain, CORE & ISBA, Belgium); LINTON, Oliver (Faculty of Economics, Cambridge University, UK)
    Abstract: The EGARCH is a popular model for discrete-time volatility since it allows for asymmetric effects and naturally ensures positivity even when exogenous variables are included. Estimation and inference are usually done via maximum likelihood. Although some progress has been made recently, a complete distribution theory of MLE for EGARCH models is still missing. Furthermore, the estimation procedure itself may be highly sensitive to starting values, the choice of numerical optimization algorithm, etc. We present an alternative estimator that is available in a simple closed form and which could be used, for example, to provide starting values for MLE. The estimator of the dynamic parameter is independent of the innovation distribution. For the other parameters we assume that the innovation distribution belongs to the class of Generalized Error Distributions (GED), profiling out its parameter in the estimation procedure. We discuss the properties of the proposed estimator and illustrate its performance in a simulation study.
    Keywords: autocorrelations, generalized error distribution, method of moments estimator, Newton-Raphson
    JEL: C12 C13 C14
    Date: 2013–05–22
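To fix ideas, here is a minimal sketch of the EGARCH(1,1) recursion the abstract refers to, showing why positivity of the conditional variance holds by construction. The parameter values are hypothetical, and this is a plain simulation, not the authors' estimator:

```python
import numpy as np

def simulate_egarch(n, omega=-0.1, beta=0.95, alpha=0.2, gamma=-0.1, seed=0):
    """Simulate EGARCH(1,1): log s2_t = omega + beta*log s2_{t-1}
    + alpha*(|z_{t-1}| - E|z|) + gamma*z_{t-1}, with z_t ~ N(0, 1).
    Because the *log* variance is modelled, s2_t > 0 by construction."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    e_abs = np.sqrt(2.0 / np.pi)        # E|z| for a standard normal innovation
    log_s2 = np.empty(n)
    log_s2[0] = omega / (1.0 - beta)    # start at the unconditional mean
    for t in range(1, n):
        log_s2[t] = (omega + beta * log_s2[t - 1]
                     + alpha * (abs(z[t - 1]) - e_abs) + gamma * z[t - 1])
    sigma = np.exp(0.5 * log_s2)
    return sigma * z, sigma             # returns and conditional volatilities

returns, sigma = simulate_egarch(5000)
```

The negative gamma produces the asymmetric effect the abstract mentions: negative shocks raise volatility more than positive shocks of the same size.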
  2. By: Lavergne, Pascal; Patilea, Valentin
    Abstract: We study the influence of a bandwidth parameter in inference with conditional estimating equations. To that end, we propose a new class of smooth minimum distance estimators and we develop a theory that focuses on uniformity in bandwidth. We establish a √n-asymptotic representation of our estimator as a process indexed by a bandwidth that can vary within a wide range, including bandwidths independent of the sample size. We develop an efficient version of our estimator. We also study its behavior in misspecified models. We develop a procedure based on a distance metric statistic for testing restrictions on parameters, as well as a bootstrap technique to account for the bandwidth's influence. Our new methods are simple to implement, apply to non-smooth problems, and perform well in our simulations.
    Keywords: Semiparametric Estimation, Conditional Estimating Equations, Smoothing Methods, Asymptotic Efficiency, Hypothesis Testing, Bootstrap.
    JEL: C12 C14
    Date: 2013–03
  3. By: Torben Klarl (University of Augsburg, Department of Economics)
    Abstract: Bootstrapping methods have so far rarely been used to evaluate spatial data sets. Based on an extensive Monte Carlo study, we find that, also for spatial cross-sectional data, the wild bootstrap test proposed by Davidson and Flachaire (2008), based on restricted residuals, clearly outperforms asymptotic as well as competing bootstrap tests such as the pairs bootstrap.
    Keywords: Spatial econometrics, Paired bootstrap, Wild bootstrap, Parameter inference
    JEL: C18 C21 R11
    Date: 2013–05
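A minimal sketch of a wild bootstrap test with restricted residuals and Rademacher weights, in the spirit of Davidson and Flachaire (2008). This is a plain regression example, not the paper's spatial setting, and the data-generating process is hypothetical:

```python
import numpy as np

def wild_bootstrap_pvalue(y, x, n_boot=999, seed=42):
    """Wild bootstrap t-test of H0: beta = 0 in y = a + b*x + u, using
    residuals restricted under H0 and Rademacher weights (+1/-1 with
    probability 1/2 each), which preserves heteroskedasticity."""
    rng = np.random.default_rng(seed)
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    XtX_inv = np.linalg.inv(X.T @ X)

    def tstat(yy):
        b = XtX_inv @ X.T @ yy
        e = yy - X @ b
        s2 = (e @ e) / (n - 2)
        return b[1] / np.sqrt(s2 * XtX_inv[1, 1])

    t_obs = tstat(y)
    u0 = y - y.mean()                             # restricted residuals (H0: b = 0)
    hits = 0
    for _ in range(n_boot):
        w = rng.choice([-1.0, 1.0], size=n)       # Rademacher weights
        y_star = y.mean() + u0 * w                # resample under the null
        hits += abs(tstat(y_star)) >= abs(t_obs)
    return hits / n_boot

rng = np.random.default_rng(1)
x = rng.standard_normal(200)
u = rng.standard_normal(200) * (1.0 + np.abs(x))  # heteroskedastic errors
y = 0.5 + 0.0 * x + u                             # the null is true here
p = wild_bootstrap_pvalue(y, x)
```

Because the bootstrap samples are built from residuals restricted under the null, the resampling scheme mimics the null distribution of the test statistic even under heteroskedasticity.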
  4. By: Xiaohong Chen (Cowles Foundation, Yale University); Demian Pouzo (Dept. of Economics, University of California, Berkeley)
    Abstract: This paper considers inference on functionals of semi/nonparametric conditional moment restrictions with possibly nonsmooth generalized residuals. These models belong to the difficult class of (nonlinear) ill-posed inverse problems with unknown operators, and include all of the (nonlinear) nonparametric instrumental variables (IV) models as special cases. For these models it is generally difficult to verify whether a functional is regular (i.e., root-n estimable) or irregular (i.e., slower than root-n estimable). In this paper we provide computationally simple, unified inference procedures that are asymptotically valid regardless of whether a functional is regular or irregular. We establish the following new results: (1) the asymptotic normality of the plug-in penalized sieve minimum distance (PSMD) estimators of the (possibly irregular) functionals; (2) the consistency of sieve variance estimators of the plug-in PSMD estimators; (3) the asymptotic chi-square distribution of an optimally weighted sieve quasi likelihood ratio (SQLR) statistic; (4) the asymptotic tight distribution of a possibly non-optimally weighted SQLR statistic; (5) the consistency of the nonparametric bootstrap and the weighted bootstrap (possibly non-optimally weighted) SQLR and sieve Wald statistics, which are proved under virtually the same conditions as those for the original-sample statistics. Small simulation studies and an empirical illustration of a nonparametric quantile IV regression are presented.
    Keywords: Nonlinear nonparametric instrumental variables; Penalized sieve minimum distance; Irregular functional; Sieve Riesz representer; Sieve quasi likelihood ratio; Asymptotic normality; Bootstrap; Sieve variance estimator
    Date: 2013–05
  5. By: Giuseppe Cavaliere (University of Bologna); Morten Ørregaard Nielsen (Queen's University and CREATES); A.M. Robert Taylor (University of Nottingham)
    Abstract: We propose bootstrap implementations of the asymptotic Wald, likelihood ratio and Lagrange multiplier tests for the order of integration of a fractionally integrated time series. Our main purpose in doing so is to develop tests which are robust to both conditional and unconditional heteroskedasticity of a quite general and unknown form in the shocks. We show that neither the asymptotic tests nor the analogues of these which obtain from using a standard i.i.d. bootstrap admit pivotal asymptotic null distributions in the presence of heteroskedasticity, but that the corresponding tests based on the wild bootstrap principle do. A heteroskedasticity-robust Wald test, based around a sandwich estimator of the variance, is also shown to deliver asymptotically pivotal inference under the null, and we show that it can be successfully bootstrapped using either i.i.d. resampling or the wild bootstrap. We quantify the dependence of the asymptotic size and local power of the asymptotic tests on the degree of heteroskedasticity present. An extensive Monte Carlo simulation study demonstrates that significant improvements in finite sample behaviour can be obtained by the bootstrap vis-à-vis the corresponding asymptotic tests in both heteroskedastic and homoskedastic environments. The results also suggest that a bootstrap algorithm based on model estimates obtained under the null hypothesis is preferable to one which uses unrestricted model estimates.
    Keywords: Bootstrap, conditional heteroskedasticity, fractional integration, likelihood-based inference, unconditional heteroskedasticity
    JEL: C12 C22
    Date: 2013–05
  6. By: Bruce Meyer; Nikolas Mittag
    Abstract: We derive the asymptotic bias from misclassification of the dependent variable in binary choice models. Measurement error is necessarily non-classical in this case, which leads to bias in linear and non-linear models even if only the dependent variable is mismeasured. A Monte Carlo study and an application to food stamp receipt show that the bias formulas are useful for analyzing the sensitivity of substantive conclusions, interpreting biased coefficients, and identifying features of the estimates that are robust to misclassification. Using administrative records linked to survey data as validation data, we examine estimators that are consistent under misclassification. They can improve estimates if their assumptions hold, but can aggravate the problem if the assumptions are invalid. The estimators differ in their robustness to such violations, which can be improved by incorporating additional information. We propose tests for the presence and nature of misclassification that can help to choose an estimator.
    Keywords: measurement error; binary choice models; program take-up; food stamps.
    JEL: C18 C81 D31 I38
    Date: 2013–05
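A small Monte Carlo illustrating the attenuation the abstract describes, using a linear probability model and hypothetical misclassification rates: with false-positive rate a0 and false-negative rate a1, E[y_obs | x] = a0 + (1 - a0 - a1) P(y = 1 | x), so the LPM slope shrinks by the factor (1 - a0 - a1). This is only an illustration, not the authors' bias formulas:

```python
import numpy as np

def lpm_slope(y, x):
    """OLS slope of a linear probability model of y on x (with intercept)."""
    X = np.column_stack([np.ones(len(x)), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b[1]

rng = np.random.default_rng(0)
n = 100_000
x = rng.uniform(0.0, 1.0, n)
y = (rng.uniform(size=n) < 0.2 + 0.5 * x).astype(float)  # true P(y=1|x) = 0.2 + 0.5x
a0, a1 = 0.05, 0.10   # hypothetical false-positive / false-negative rates
flip = np.where(y == 1.0,
                rng.uniform(size=n) < a1,    # 1 -> 0 with probability a1
                rng.uniform(size=n) < a0)    # 0 -> 1 with probability a0
y_obs = np.where(flip, 1.0 - y, y)

b_true = lpm_slope(y, x)     # close to the true slope 0.5
b_obs = lpm_slope(y_obs, x)  # attenuated toward (1 - a0 - a1) * 0.5 = 0.425
```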
  7. By: Glombek, Konstantin
    Abstract: This article provides a new test for sphericity of the covariance matrix of a d-dimensional multinormal population X ∼ N_d(µ,Σ). This test is applicable if the sample size, n + 1, and d both go to infinity while d/n → y ∈ (0,∞), provided that the limits of tr(Σ^k)/d, k = 1,...,8, are finite. The main idea of this test is to check whether the empirical eigenvalue distribution of a suitably standardized sample covariance matrix obeys the semicircle law. Due to similarities of the semicircle law to the normal distribution, the proposed test statistic is of the type of the Jarque-Bera test statistic. Simulation results show that the new sphericity test outperforms the tests from the current literature for certain local alternatives if y is small.
    Keywords: Test for covariance matrix, High-dimensional data, Spectral distribution, Semicircle law, Free cumulant, Jarque-Bera test
    Date: 2013
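To illustrate the construction, here is a rough sketch of a Jarque-Bera-type statistic on an empirical eigenvalue distribution: the semicircle law has skewness 0 and kurtosis 2 (versus 3 for the normal). The standardization of the sample covariance matrix below is crude, and the classical JB weights 6 and 24 are reused purely for illustration; the paper derives the proper standardization and asymptotic scaling:

```python
import numpy as np

def jb_type_statistic(eigvals):
    """JB-type statistic on an empirical eigenvalue distribution: penalize
    deviations from the semicircle targets skewness 0 and kurtosis 2.
    The weights 6 and 24 are the classical JB ones, kept for illustration."""
    m = len(eigvals)
    z = (eigvals - eigvals.mean()) / eigvals.std()
    skew = np.mean(z**3)
    kurt = np.mean(z**4)
    return m * (skew**2 / 6.0 + (kurt - 2.0)**2 / 24.0)

rng = np.random.default_rng(0)
n, d = 2000, 200                      # d/n = 0.1, a "small y" scenario
X = rng.standard_normal((n, d))       # spherical population, Sigma = I
S = X.T @ X / n                       # sample covariance matrix
# Crude standardization; its spectrum is roughly semicircular for small d/n.
eig = np.linalg.eigvalsh(np.sqrt(n / d) * (S - np.eye(d)))
stat = jb_type_statistic(eig)
```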
  8. By: Jennifer Castle; David Hendry
    Abstract: We consider model selection for non-linear dynamic equations with more candidate variables than observations, based on a general class of non-linear-in-the-variables functions, addressing possible location shifts by impulse-indicator saturation. After an automatic search delivers a simplified congruent terminal model, an encompassing test can be implemented against an investigator's preferred non-linear function. When that is non-linear in the parameters, such as a threshold model, the overall approach can only be semi-automatic. The method is applied to re-analyze an empirical model of real wages in the UK over 1860-2004, updated and extended to 2005-2011 for forecast evaluation.
    Keywords: Non-linear models, location shifts, model selection, Autometrics, impulse-indicator saturation
    JEL: C51 C22
    Date: 2013–05–16
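A stripped-down sketch of one impulse-indicator saturation pass: saturate each half-sample block with an impulse dummy per observation, estimate by OLS, and retain observations whose dummy has a large t-ratio. Autometrics embeds this idea in a full multi-path search; the data and critical value here are hypothetical:

```python
import numpy as np

def iis_outliers(y, x, crit=2.58):
    """One impulse-indicator saturation pass over two half-sample blocks.
    Returns the indices of observations whose impulse dummy is significant.
    (A sketch of the core idea only, not the full Autometrics algorithm.)"""
    n = len(y)
    kept = []
    for lo, hi in ((0, n // 2), (n // 2, n)):
        idx = np.arange(lo, hi)
        D = np.zeros((n, len(idx)))
        D[idx, np.arange(len(idx))] = 1.0          # one dummy per observation
        X = np.column_stack([np.ones(n), x, D])
        XtX_inv = np.linalg.inv(X.T @ X)
        b = XtX_inv @ X.T @ y
        e = y - X @ b
        s2 = (e @ e) / (n - X.shape[1])
        t = b / np.sqrt(s2 * np.diag(XtX_inv))     # t-ratios of all coefficients
        kept.extend(int(i) for j, i in enumerate(idx) if abs(t[2 + j]) > crit)
    return sorted(kept)

rng = np.random.default_rng(0)
n = 100
x = rng.standard_normal(n)
y = 1.0 + 0.5 * x + rng.standard_normal(n)
y[30] += 8.0                 # plant a large outlier / location shift at t = 30
detected = iis_outliers(y, x)
```

The dummied observations are fitted exactly, so each dummy coefficient equals that observation's prediction error from the model estimated on the remaining data; a large t-ratio flags an outlier or shift.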
  9. By: Manabu Asai (Soka University, Japan); Massimiliano Caporin (University of Padova, Italy); Michael McAleer (Erasmus University Rotterdam, The Netherlands, Complutense University of Madrid, Spain, and Kyoto University, Japan)
    Abstract: Most multivariate variance or volatility models suffer from a common problem, the “curse of dimensionality”. For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, which aim to combine the interpretability and efficiency needed by model users with the computational tractability required when the number of assets is very large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models. The empirical analysis of stock returns on the US market shows that 1% and 5% Value-at-Risk thresholds based on one-step-ahead forecasts of covariances from the new specification are satisfactory for the period including the Global Financial Crisis.
    Keywords: block structures; multivariate stochastic volatility; curse of dimensionality; leverage effects; multi-factors; heavy-tailed distribution
    JEL: C32 C51 C10
    Date: 2013–05–27
  10. By: Taiane S. Prass; Sílvia R. C. Lopes
    Abstract: We consider the Fractionally Integrated Exponential Generalized Autoregressive Conditional Heteroskedasticity process, denoted FIEGARCH(p,d,q), introduced by Bollerslev and Mikkelsen (1996). We present a simulation study on the estimation of the risk measure VaR_p for FIEGARCH processes. We consider the distribution function of the portfolio log-returns (univariate case) and the multivariate distribution function of the risk-factor changes (multivariate case). We also compare the performance of the risk measures VaR_p, ES_p and MaxLoss for a portfolio composed of stocks of four Brazilian companies.
    Date: 2013–05
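For readers unfamiliar with the risk measure, a minimal sketch of VaR_p under two generic approaches (conditional normality and historical simulation). The toy returns are i.i.d. Gaussian rather than a FIEGARCH process, and the functions are illustrative, not the authors' estimators:

```python
import numpy as np
from statistics import NormalDist

def gaussian_var(mu, sigma, p=0.05):
    """VaR_p under normality: the loss level exceeded with probability p."""
    return -(mu + sigma * NormalDist().inv_cdf(p))

def empirical_var(returns, p=0.05):
    """Historical-simulation VaR_p: minus the empirical p-quantile."""
    return -np.quantile(returns, p)

rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, size=10_000)            # toy i.i.d. log-returns
var_g = gaussian_var(r.mean(), r.std(), p=0.05)   # roughly 0.01 * 1.645
var_e = empirical_var(r, p=0.05)
```

In a FIEGARCH setting one would feed the model's one-step-ahead volatility forecast into the conditional-distribution version of the first function.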
  11. By: Caporin, M.; McAleer, M.J.
    Abstract: The purpose of the paper is to discuss ten things potential users should know about the limits of the Dynamic Conditional Correlation (DCC) representation for estimating and forecasting time-varying conditional correlations. The reasons given for caution about the use of DCC include the following: DCC represents the dynamic conditional covariances of the standardized residuals, and hence does not yield dynamic conditional correlations; DCC is stated rather than derived; DCC has no moments; DCC does not have testable regularity conditions; DCC yields inconsistent two step estimators; DCC has no asymptotic properties; DCC is not a special case of GARCC, which has testable regularity conditions and standard asymptotic properties; DCC is not dynamic empirically as the effect of news is typically extremely small; DCC cannot be distinguished empirically from diagonal BEKK in small systems; and DCC may be a useful filter or a diagnostic check, but it is not a model.
    Keywords: conditional correlations;regularity conditions;conditional covariances;BEKK;DCC;GARCC;assumed properties;asymptotic properties;derived model;diagnostic check;filter;moments;stated representation;two step estimators
    Date: 2013–03–01
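For reference, the correlation recursion of the DCC representation discussed above can be sketched as follows (parameter values and data are hypothetical; as the abstract stresses, the R_t produced this way are rescaled conditional covariances of the standardized residuals):

```python
import numpy as np

def dcc_correlations(eps, a=0.05, b=0.90):
    """Correlation recursion of the DCC representation:
        Q_t = (1 - a - b) * S + a * e_{t-1} e_{t-1}' + b * Q_{t-1},
        R_t = diag(Q_t)^{-1/2} Q_t diag(Q_t)^{-1/2},
    where eps holds standardized residuals and S is their sample covariance."""
    T, k = eps.shape
    S = np.cov(eps, rowvar=False)
    Q = S.copy()                       # initialize at the unconditional target
    R = np.empty((T, k, k))
    for t in range(T):
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)      # rescale Q_t to unit diagonal
        Q = (1 - a - b) * S + a * np.outer(eps[t], eps[t]) + b * Q
    return R

rng = np.random.default_rng(0)
eps = rng.standard_normal((500, 3))    # toy standardized residuals
R = dcc_correlations(eps)
```

With a + b < 1 each Q_t is a sum of positive semi-definite terms, so the rescaled R_t always has unit diagonal and off-diagonal entries in [-1, 1].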
  12. By: Christian Gouriéroux (CREST and University of Toronto); Jean-Michel Zakoian (CREST and University Lille 3)
    Abstract: Linear mixed causal and noncausal autoregressive processes often provide a better fit to economic and financial time series than standard causal linear autoregressive processes. Using the example of the noncausal Cauchy autoregressive process, we show that this might be explained by the special nonlinear causal dynamics associated with such processes. Indeed, these causal dynamics can include unit roots, bubble phenomena, or the asymmetric cycles often observed in financial markets. The noncausal Cauchy autoregressive process provides a new model for explosive multiple bubbles and their transmission in a multivariate dynamic framework. We also explain why standard unit root tests fail to detect such explosive bubbles.
    Keywords: Causal Innovation, Explosive Bubble, Noncausal Process, Unit Root, Bubble Cointegration
    Date: 2013–02
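A minimal sketch of a purely noncausal Cauchy AR(1), simulated backwards in calendar time; the parameters are hypothetical. Viewed forwards in calendar time, the path exhibits the bubble-like explosive episodes followed by crashes that the abstract describes:

```python
import numpy as np

def simulate_noncausal_ar1(n, rho=0.8, seed=1, burn=200):
    """Simulate a purely noncausal Cauchy AR(1), x_t = rho * x_{t+1} + eps_t,
    by iterating the recursion backwards from a burn-in tail, so that
    x_t approximates sum_{j>=0} rho^j eps_{t+j}."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_cauchy(n + burn)     # heavy-tailed Cauchy innovations
    x = np.zeros(n + burn)
    for t in range(n + burn - 2, -1, -1):   # backward recursion
        x[t] = rho * x[t + 1] + eps[t]
    return x[:n]

path = simulate_noncausal_ar1(1000)
```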

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.