nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒11‒09
fifteen papers chosen by
Sune Karlsson
Orebro University

  1. Adaptive quadrature for likelihood inference on dynamic latent variable models for time-series and panel data By Cagnone, Silvia; Bartolucci, Francesco
  2. Consistent estimation of the Value-at-Risk when the error distribution of the volatility model is misspecified By El Ghourabi, Mohamed; Francq, Christian; Telmoudi, Fedya
  3. New Goodness-of-fit Diagnostics for Conditional Discrete Response Models By Igor Kheifets; Carlos Velasco
  4. Optimal Uniform Convergence Rates for Sieve Nonparametric Instrumental Variables Regression By Xiaohong Chen; Timothy Christensen
  5. Tie the straps: uniform bootstrap confidence bands for bounded influence curve estimators By Wolfgang Karl Härdle; Ya'acov Ritov; Weining Wang
  6. Varying coefficient models as Mixed Models : reparametrization methods and bayesian estimation By Anna Freni Sterrantino
  7. The Tempered Ordered Probit (TOP) Model with an Application to Monetary Policy By Greene, William H.; Gillman, Max; Harris, Mark N.; Spencer, Christopher
  8. Contributions to the Theory of Optimal Tests By Moreira, Humberto; Moreira, Marcelo J.
  9. Detrending moving-average cross-correlation coefficient: Measuring cross-correlations between non-stationary series By Ladislav Kristoufek
  10. Risk Measure Inference By Christophe Hurlin; Sebastien Laurent; Rogier Quaedvlieg; Stephan Smeekes
  11. Misspecification-robust inference in linear asset pricing models with irrelevant risk factors By Nikolay Gospodinov; Raymond Kan; Cesare Robotti
  12. Bayesian estimation of a DSGE model with asset prices By Kliem, Martin; Uhlig, Harald
  13. On Multivariate Extensions of Conditional-Tail-Expectation By Areski Cousin; Elena Di Bernardino
  14. On time scaling of semivariance in a jump-diffusion process By Rodrigue Oeuvray; Pascal Junod
  15. Measurement Error and Policy Evaluation in the Frequency Domain By Xiangrong Yu

  1. By: Cagnone, Silvia; Bartolucci, Francesco
    Abstract: Maximum likelihood estimation of dynamic latent variable models requires solving integrals that are not analytically tractable. Numerical approximations represent a possible solution to this problem. We propose to use the Adaptive Gaussian-Hermite (AGH) numerical quadrature approximation for a class of dynamic latent variable models for time-series and panel data. These models are based on continuous time-varying latent variables which follow an autoregressive process of order 1, AR(1). Two examples of such models are the stochastic volatility models for the analysis of financial time-series and the limited dependent variable models for the analysis of panel data. A comparison between the performance of AGH methods and alternative approximation methods proposed in the literature is carried out by simulation. Examples on real data are also used to illustrate the proposed approach.
    Keywords: AR(1); categorical longitudinal data; Gaussian-Hermite quadrature; limited dependent variable models; stochastic volatility model
    JEL: C13 C32 C33
    Date: 2013–10–29
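The Gauss-Hermite rule at the core of the AGH approximation can be sketched in a few lines. This is a generic illustration of the quadrature step, not the authors' adaptive implementation; the function name and the centring choice are assumptions made here:

```python
import numpy as np

def gh_expectation(g, mu, sigma, n_nodes=20):
    """Approximate E[g(X)] for X ~ N(mu, sigma^2) by Gauss-Hermite quadrature.

    hermgauss gives nodes/weights for the weight exp(-x^2); substituting
    x = mu + sqrt(2)*sigma*t turns the sum into a normal expectation."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    x = mu + np.sqrt(2.0) * sigma * nodes          # rescaled quadrature points
    return np.sum(weights * g(x)) / np.sqrt(np.pi)

# sanity check: E[X^2] = 1 for a standard normal
approx = gh_expectation(lambda x: x**2, mu=0.0, sigma=1.0)
```

Adaptivity enters by recentring `mu` and rescaling `sigma` at the mode and curvature of the integrand at each likelihood evaluation, which keeps the handful of nodes where the integrand actually has mass.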
  2. By: El Ghourabi, Mohamed; Francq, Christian; Telmoudi, Fedya
    Abstract: A two-step approach for conditional Value at Risk (VaR) estimation is considered. In the first step, a generalized-quasi-maximum likelihood estimator (gQMLE) is employed to estimate the volatility parameter, and in the second step the empirical quantile of the residuals serves to estimate the theoretical quantile of the innovations. When the instrumental density $h$ of the gQMLE is not the Gaussian density utilized in the standard QMLE, or is not the true distribution of the innovations, both the estimations of the volatility and of the quantile are asymptotically biased. The two errors however counterbalance each other, and we finally obtain a consistent estimator of the conditional VaR. For a wide class of GARCH models, we derive the asymptotic distribution of the VaR estimation based on gQMLE. We show that the optimal instrumental density $h$ depends neither on the GARCH parameter nor on the risk level, but only on the distribution of the innovations. A simple adaptive method based on empirical moments of the residuals makes it possible to infer an optimal element within a class of potential instrumental densities. Important asymptotic efficiency gains are achieved by using gQMLE instead of the usual Gaussian QML when the innovations are heavy-tailed. We extend our approach to Distortion Risk Measure parameter estimation, where consistency of the gQMLE-based method is also proved. Numerical illustrations are provided, through simulation experiments and an application to financial stock indexes.
    Keywords: APARCH, Conditional VaR, Distortion Risk Measures, GARCH, Generalized Quasi Maximum Likelihood Estimation, Instrumental density.
    JEL: C22 C58
    Date: 2013–10
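The two-step logic of the abstract can be illustrated with a toy filter. The sketch below substitutes a simple EWMA volatility recursion for the paper's gQMLE-estimated GARCH; the function name, the EWMA stand-in and the smoothing parameter are assumptions, not the paper's estimator:

```python
import numpy as np

def two_step_var(returns, alpha=0.05, lam=0.94):
    """Two-step conditional VaR sketch: (1) filter the volatility (here an
    EWMA recursion standing in for a gQMLE-fitted GARCH), (2) take the
    empirical alpha-quantile of the standardized residuals."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, len(r)):
        sigma2[t] = lam * sigma2[t - 1] + (1.0 - lam) * r[t - 1] ** 2
    resid = r / np.sqrt(sigma2)                   # standardized residuals
    q = np.quantile(resid, alpha)                 # empirical innovation quantile
    next_sigma = np.sqrt(lam * sigma2[-1] + (1.0 - lam) * r[-1] ** 2)
    return -next_sigma * q                        # one-step-ahead VaR (loss > 0)

rng = np.random.default_rng(0)
var_est = two_step_var(0.01 * rng.standard_normal(5000), alpha=0.05)
```

The key point of the paper survives even in this toy: the residual quantile absorbs the bias left by a misspecified innovation density in the volatility step.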
  3. By: Igor Kheifets (New Economic School, Moscow); Carlos Velasco (Dept. of Economics, Universidad Carlos III de Madrid)
    Abstract: This paper proposes new specification tests for conditional models with discrete responses. In particular, we can test the static and dynamic ordered choice model specifications, which is key to applying efficient maximum likelihood methods, obtaining consistent estimates of partial effects, and making appropriate predictions of the probability of future events. The traditional approach is based on probability integral transforms of jittered discrete data, which lead to continuous iid uniform series under the true conditional distribution. In this paper we investigate an alternative transformation based only on the original discrete data. We show analytically and in simulations that our approach dominates the traditional approach in terms of power. We apply the new tests to models of the monetary policy conducted by the Federal Reserve.
    Keywords: Specification tests, Count data, Dynamic discrete choice models, Conditional probability integral transform
    JEL: C12 C22 C52
    Date: 2013–11
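The jittered transform mentioned as the traditional approach can be written down directly: for a discrete response with conditional cdf F, u_t = F(y_t - 1) + v_t (F(y_t) - F(y_t - 1)) with v_t ~ U(0,1) is iid uniform under the true model. A minimal sketch, where the helper name and the geometric example are illustrative choices, not the paper's setup:

```python
import numpy as np

def randomized_pit(y, cdf, seed=0):
    """Randomized (jittered) probability integral transform for discrete data:
    u_t = F(y_t - 1) + v_t * (F(y_t) - F(y_t - 1)), with v_t ~ U(0,1).
    Under the true distribution the u_t are iid U(0,1)."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y)
    v = rng.uniform(size=len(y))
    lo = cdf(y - 1)
    hi = cdf(y)
    return lo + v * (hi - lo)

# example: geometric counts on {0, 1, 2, ...} with success probability p
p = 0.3
cdf = lambda k: np.where(k < 0, 0.0, 1.0 - (1.0 - p) ** (np.floor(k) + 1))
rng = np.random.default_rng(2)
y = rng.geometric(p, size=100_000) - 1            # numpy's geometric starts at 1
u = randomized_pit(y, cdf, seed=3)
```

A uniformity test applied to `u` then checks the conditional specification; the paper's contribution is a transform that avoids the jittering noise altogether.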
  4. By: Xiaohong Chen (Cowles Foundation, Yale University); Timothy Christensen (Dept. of Economics, Yale University)
    Abstract: We study the problem of nonparametric regression when the regressor is endogenous, an important nonparametric instrumental variables (NPIV) regression problem in econometrics and a difficult ill-posed inverse problem with an unknown operator in statistics. We first establish a general upper bound on the sup-norm (uniform) convergence rate of a sieve estimator, allowing for endogenous regressors and weakly dependent data. This result leads to the optimal sup-norm convergence rates for spline and wavelet least squares regression estimators under weakly dependent data and heavy-tailed error terms. This upper bound also yields the sup-norm convergence rates for sieve NPIV estimators under i.i.d. data: the rates coincide with the known optimal L^2-norm rates for severely ill-posed problems, and are a power of log(n) slower than the optimal L^2-norm rates for mildly ill-posed problems. We then establish the minimax risk lower bound in sup-norm loss, which coincides with our upper bounds on sup-norm rates for the spline and wavelet sieve NPIV estimators. This sup-norm rate optimality provides another justification for the wide application of sieve NPIV estimators. Useful results on weakly dependent random matrices are also provided.
    Keywords: Nonparametric instrumental variables; Statistical ill-posed inverse problems; Optimal uniform convergence rates; Weak dependence; Random matrices; Splines; Wavelets
    JEL: C13 C14 C32
    Date: 2013–11
  5. By: Wolfgang Karl Härdle; Ya'acov Ritov; Weining Wang
    Abstract: We consider theoretical bootstrap "coupling" techniques for nonparametric robust smoothers and quantile regression, and verify the bootstrap improvement. To cope with the curse of dimensionality, a variant of "coupling" bootstrap techniques is developed for additive models with symmetric error distributions, with a further extension to the quantile regression framework. Our bootstrap method can be used in many situations, such as constructing confidence intervals and bands. We demonstrate the bootstrap improvement over the asymptotic band theoretically, and also in simulations and in applications to firm expenditures and the interaction of economic sectors and the stock market.
    Keywords: Nonparametric Regression, Bootstrap, Quantile Regression, Confidence Bands, Additive Model, Robust Statistics
    JEL: C00 C14
    Date: 2013–10
  6. By: Anna Freni Sterrantino (Università di Bologna)
    Abstract: Non-linear relationships are accommodated in a regression model using smoothing functions. Interactions may occur between continuous variables; in this case, an interaction between a nonlinear and a linear covariate leads to a varying coefficient model (VCM), a subclass of the generalized additive model. Additive models can be estimated as generalized linear mixed models after being reparametrized. In this article we show three different types of design matrices for the mixed-model representation of VCMs, using B-spline smoothing functions. An application to real data is provided, and model estimates are computed with a Bayesian approach.
    Keywords: Varying coefficient models, Generalized linear mixed models, Reparametrization, B-splines
    Date: 2013
  7. By: Greene, William H.; Gillman, Max; Harris, Mark N.; Spencer, Christopher
    Abstract: We propose a Tempered Ordered Probit (TOP) model. Our contribution lies not only in explicitly accounting for an excessive number of observations in a given choice category, as is the case in the standard literature on inflated models; rather, we introduce a new econometric model which nests the recently developed Middle Inflated Ordered Probit (MIOP) models of Bagozzi and Mukherjee (2012) and Brooks, Harris, and Spencer (2012) as a special case, and further, can be used as a specification test of the MIOP, where the implicit test is described as being one of symmetry versus asymmetry. In our application, which exploits a panel dataset containing the votes of Bank of England Monetary Policy Committee (MPC) members, we show that the TOP model affords the econometrician considerable flexibility with respect to modeling the impact of different forms of uncertainty on interest rate decisions. Our findings, we argue, reveal MPC members' asymmetric attitudes towards uncertainty and the changeability of interest rates.
    Keywords: Monetary policy committee, voting, discrete data, uncertainty, tempered equations
    JEL: C3 E50
    Date: 2013–09
  8. By: Moreira, Humberto; Moreira, Marcelo J.
    Abstract: This paper considers tests which maximize the weighted average power (WAP). The focus is on determining WAP tests subject to an uncountable number of equalities and/or inequalities. The unifying theory allows us to obtain tests with correct size, similar tests, and unbiased tests, among others. A WAP test may be randomized and its characterization is not always possible. We show how to approximate the power of the optimal test by sequences of nonrandomized tests. Two alternative approximations are considered. The first approach considers a sequence of similar tests for an increasing number of boundary conditions. This discretization allows us to implement the WAP tests in practice. The second method finds a sequence of tests which approximate the WAP test uniformly. This approximation allows us to show that WAP similar tests are admissible. The theoretical framework is readily applicable to several econometric models, including the important class of the curved-exponential family. In this paper, we consider the instrumental variable model with heteroskedastic and autocorrelated errors (HAC-IV) and the nearly integrated regressor model. In both models, we find WAP similar and (locally) unbiased tests which dominate other available tests.
    Date: 2013–10–28
  9. By: Ladislav Kristoufek
    Abstract: In the paper, we introduce a new measure of correlation between possibly non-stationary series. As the measure is based on detrending moving-average cross-correlation analysis (DMCA), we label it the DMCA coefficient $\rho_{DMCA}(\lambda)$ with a moving-average window length $\lambda$. We show analytically that the coefficient ranges between -1 and 1, as a standard correlation does. In a simulation study, we show that the values of $\rho_{DMCA}(\lambda)$ correspond very well to the true correlation between the analyzed series regardless of the level of (non-)stationarity. The dependence of the newly proposed measure on other parameters -- correlation level, moving-average window length and time series length -- is discussed as well.
    Date: 2013–11
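A minimal version of the DMCA coefficient can be computed as the ratio of the detrended covariance to the product of the detrended standard deviations, each taken against a centred moving average of the integrated series. The sketch below is an illustration of that construction (the function name and alignment details are assumptions, not the author's code):

```python
import numpy as np

def dmca_coefficient(x, y, window):
    """DMCA coefficient: correlation of moving-average-detrended profiles.

    Each series is integrated (cumulative sum), detrended by a centred
    moving average of length `window` (odd), and the ratio
    mean(dx*dy) / sqrt(mean(dx^2) * mean(dy^2)) is returned."""
    x = np.cumsum(np.asarray(x, dtype=float))     # integrated profile
    y = np.cumsum(np.asarray(y, dtype=float))
    kernel = np.ones(window) / window
    x_ma = np.convolve(x, kernel, mode="valid")
    y_ma = np.convolve(y, kernel, mode="valid")
    half = (window - 1) // 2                      # align centred MA with data
    dx = x[half: half + len(x_ma)] - x_ma         # residuals from moving trend
    dy = y[half: half + len(y_ma)] - y_ma
    return np.mean(dx * dy) / np.sqrt(np.mean(dx**2) * np.mean(dy**2))

rng = np.random.default_rng(0)
a = rng.standard_normal(2000)
b = rng.standard_normal(2000)
rho_same = dmca_coefficient(a, a, window=11)
rho_indep = dmca_coefficient(a, b, window=11)
```

By the Cauchy-Schwarz inequality the ratio is automatically bounded by 1 in absolute value, which is the boundedness property the abstract proves for the general coefficient.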
  10. By: Christophe Hurlin (LEO - Laboratoire d'économie d'Orleans - CNRS : UMR6221 - Université d'Orléans); Sebastien Laurent (IAE Aix-en-Provence - Institut d'Administration des Entreprises - Aix-en-Provence - Université Paul Cézanne - Aix-Marseille III, GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - École des Hautes Études en Sciences Sociales [EHESS] - CNRS : UMR7316); Rogier Quaedvlieg (Maastricht University - univ. Maastricht); Stephan Smeekes (Maastricht University - univ. Maastricht)
    Abstract: We propose a widely applicable bootstrap-based test of the null hypothesis of equality of two firms' Risk Measures (RMs) at a single point in time. The test can be applied to any market-based measure. In an iterative procedure, we can identify a complete grouped ranking of the RMs, with particular application to finding buckets of firms of equal systemic risk. An extensive Monte Carlo simulation shows desirable properties. We provide an application on a sample of 94 U.S. financial institutions using the ΔCoVaR, MES and %SRISK, and conclude that only %SRISK can be estimated with enough precision to allow for a meaningful ranking.
    Keywords: Bootstrap; Grouped Ranking; Risk Measures; Uncertainty
    Date: 2013–10–28
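The basic pairwise comparison behind such a test can be sketched with a naive iid pairs bootstrap of the difference in a risk measure; the paper's procedure handles estimation uncertainty far more carefully, so the following is only an illustration (the names and the resampling scheme are assumptions):

```python
import numpy as np

def bootstrap_equality_pvalue(rm, x, y, n_boot=999, seed=0):
    """Naive bootstrap test of H0: RM(x) == RM(y) for two firms' samples.

    Resamples the two (equal-length) samples jointly, recomputes the
    difference in the risk measure, and compares the observed difference
    with the recentred bootstrap distribution."""
    rng = np.random.default_rng(seed)
    d0 = rm(x) - rm(y)                            # observed difference
    n = len(x)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)               # resample pairs jointly
        diffs[b] = rm(x[idx]) - rm(y[idx])
    # recentre under H0 and compute a two-sided bootstrap p-value
    return np.mean(np.abs(diffs - d0) >= np.abs(d0))

rng = np.random.default_rng(1)
x = rng.standard_normal(500)
# identical samples: the test should never reject
p = bootstrap_equality_pvalue(lambda s: np.quantile(s, 0.95), x, x + 0.0)
```

Iterating such pairwise comparisons over all firms is what produces the grouped ranking ("buckets") described in the abstract.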
  11. By: Nikolay Gospodinov; Raymond Kan; Cesare Robotti
    Abstract: We show that in misspecified models with useless factors (for example, factors that are independent of the returns on the test assets), the standard inference procedures tend to erroneously conclude, with high probability, that these irrelevant factors are priced and the restrictions of the model hold. Our proposed model selection procedure, which is robust to useless factors and potential model misspecification, restores the standard inference and proves to be effective in eliminating factors that do not improve the model's pricing ability. The practical relevance of our analysis is illustrated using simulations and empirical applications.
    Date: 2013
  12. By: Kliem, Martin; Uhlig, Harald
    Abstract: This paper presents a novel Bayesian method for estimating dynamic stochastic general equilibrium (DSGE) models subject to a constrained posterior distribution of the implied Sharpe ratio. We apply our methodology to a DSGE model with habit formation in consumption and leisure, using an estimate of the Sharpe ratio to construct the constraint. We show that the constrained estimation produces a quantitative model with reasonable asset-pricing as well as business-cycle implications.
    Keywords: Bayesian estimation, stochastic steady-state, prior choice, Sharpe ratio
    JEL: C11 E32 E44 G12
    Date: 2013
  13. By: Areski Cousin (SAF - Laboratoire de Sciences Actuarielle et Financière - Université Claude Bernard - Lyon I : EA2429); Elena Di Bernardino (IMATH - Département Ingénierie Mathématique - Conservatoire National des Arts et Métiers (CNAM))
    Abstract: In this paper, we introduce two alternative extensions of the classical univariate Conditional-Tail-Expectation (CTE) in a multivariate setting. Contrary to allocation measures or systemic risk measures, these measures are also suitable for multivariate risk problems where risks are heterogeneous in nature and cannot be aggregated together.
    Keywords: Multivariate risk measures, Level sets of distribution functions, Multivariate probability integral transformation, Stochastic orders, Copulas and dependence.
    Date: 2013–10–28
  14. By: Rodrigue Oeuvray; Pascal Junod
    Abstract: The aim of this paper is to examine the time scaling of the semivariance when returns are modeled by various types of jump-diffusion processes, including stochastic volatility models with jumps in returns and in volatility. In particular, we derive an exact formula for the semivariance when the volatility is kept constant, explaining how it should be scaled when considering a lower frequency. We also provide and justify the use of a generalization of the Ball-Torous approximation of a jump-diffusion process, this new model appearing to deliver a more accurate estimation of the downside risk. We use Markov Chain Monte Carlo (MCMC) methods to fit our stochastic volatility model. For the tests, we apply our methodology to a highly skewed set of returns based on the Barclays US High Yield Index, where we compare different time scalings for the semivariance. Our work shows that the square root of the time horizon seems to be a poor approximation in the context of semivariance and that our methodology based on jump-diffusion processes gives much better results.
    Date: 2013–11
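Semivariance itself, and the naive square-root-of-time rule the paper argues against (linear in the horizon on the squared scale), are easy to write down. The jump process below is an illustrative stand-in, not the fitted model from the paper:

```python
import numpy as np

def semivariance(returns, threshold=0.0):
    """Downside semivariance: mean squared shortfall below `threshold`."""
    r = np.asarray(returns, dtype=float)
    d = np.minimum(r - threshold, 0.0)            # keep only the downside
    return np.mean(d**2)

# daily returns with occasional negative jumps (illustrative parameters)
rng = np.random.default_rng(1)
daily = 0.01 * rng.standard_normal(210_000)
daily -= 0.05 * (rng.uniform(size=daily.size) < 0.01)

scaled = 21 * semivariance(daily)                 # naive time scaling to 21 days
monthly = daily.reshape(-1, 21).sum(axis=1)       # actual 21-day returns
direct = semivariance(monthly)                    # semivariance at the horizon
```

For skewed, jumpy returns the two numbers diverge, because aggregation reshapes the return distribution; that gap is what the paper quantifies with its jump-diffusion formulas.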
  15. By: Xiangrong Yu (Hong Kong Institute for Monetary Research)
    Abstract: This paper explores frequency-specific implications of measurement error for the design of stabilization policy rules. Policy evaluation in the frequency domain is interesting because the characterization of policy effects frequency by frequency gives the policymaker additional information about the effects of a given policy. Further, some important aspects of policy analysis can be better understood in the frequency domain than in the time domain. In this paper, I develop a rich set of design limits that describe fundamental restrictions on how a policymaker can alter variance at different frequencies. I also examine the interaction of measurement error and model uncertainty to understand the effects of different sources of informational limit on optimal policymaking. In a linear feedback model with noisy state observations, measurement error seriously distorts the performance of the policy rule that is optimal for the noise-free system. Adjusting the policy to appropriately account for measurement error means that the policymaker becomes less responsive to the raw data. For a parameterized example which corresponds to the choice of monetary policy rules in a simple AR(1) environment, I show that an additive white noise process of measurement error has little impact at low frequencies but induces less active control at high frequencies, and may even lead to more aggressive policy actions at medium frequencies. Local robustness analysis indicates that measurement error reduces the policymaker's reaction to model uncertainty, especially at medium and high frequencies.
    Keywords: Policy Evaluation, Measurement Error, Spectral Analysis, Design Limits, Model Uncertainty, Monetary Policy Rules
    JEL: C52 E52 E58
    Date: 2013–10

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.