nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒04‒30
sixteen papers chosen by
Sune Karlsson
Orebro University

  1. Hierarchical Shrinkage Priors for Dynamic Regressions with Many Predictors By Dimitris Korobilis
  2. Quantile Regression with Censoring and Endogeneity By Victor Chernozhukov; Ivan Fernandez-Val; Amanda Kowalski
  3. The Variance Profile By Luati, Alessandra; Proietti, Tommaso; Reale, Marco
  4. Robustness of Bootstrap in Instrumental Variable Regression By Lorenzo Camponovo; Taisuke Otsu
  5. Breakdown Point Theory for Implied Probability Bootstrap By Lorenzo Camponovo; Taisuke Otsu
  6. Fixed effects models, random effects models, mixed models or multilevel models: properties and implementation of models of heterogeneity in the presence of clustered data By L. DAVEZIES
  7. Bayesian Factor Selection in Dynamic Term Structure Models By Márcio Laurini
  8. Local Identification of Nonparametric and Semiparametric Models By Xiaohong Chen; Victor Chernozhukov; Sokbae Lee; Whitney Newey
  9. A sharp analysis on the asymptotic behavior of the Durbin-Watson statistic for the first-order autoregressive process By Bernard Bercu; Frederic Proia
  10. Censored Demand System Estimation with Endogenous Expenditures in Clustered Samples: an application to food demand in urban Mozambique By Mikkel Barslund
  11. Cointegration test with stationary covariates and the CDS-bond basis during the financial crisis By Jason J. Wu; Aaron L. Game
  12. Dynamic factor value-at-risk for large, heteroskedastic portfolios By Sirio Aramonte; Marius del Giudice Rodriguez; Jason J. Wu
  13. The Riskiness of Risk Models By Christophe Boucher; Bertrand Maillet
  14. Multivariate VaRs for Operational Risk Capital Computation : a Vine Structure Approach By Dominique Guegan; Bertrand Hassani
  15. Robust FDI Determinants: Bayesian Model Averaging In The Presence Of Selection Bias By Theo S Eicher; Lindy Helfman; Alex Lenkoski
  16. Beyond the DSGE Straitjacket By Pesaran, Hashem; Smith, Ron P.

  1. By: Dimitris Korobilis (Université Catholique de Louvain; The Rimini Centre for Economic Analysis (RCEA))
    Abstract: This paper builds on a simple unified representation of shrinkage Bayes estimators based on hierarchical Normal-Gamma priors. Various popular penalized least squares estimators for shrinkage and selection in regression models can be recovered using this single hierarchical Bayes formulation. Using 129 U.S. macroeconomic quarterly variables for the period 1959 – 2010 I exhaustively evaluate the forecasting properties of Bayesian shrinkage in regressions with many predictors. Results show that for particular data series hierarchical shrinkage dominates factor model forecasts, and hence it becomes a valuable addition to existing methods for handling large dimensional data.
    Keywords: Forecasting; shrinkage; factor model; variable selection; Bayesian LASSO
    JEL: C11 C22 C52 C53 C63 E37
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:21_11&r=ecm
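A flavour of the hierarchical Normal-Gamma formulation can be given through its best-known special case, the Bayesian LASSO of Park and Casella (2008), which the keywords cite. The Python sketch below is ours, not the paper's code, and fixes the penalty `lam` rather than treating it hierarchically as the paper does:

```python
import numpy as np

def bayesian_lasso_gibbs(y, X, lam=1.0, n_iter=2000, burn=500, seed=0):
    """Minimal Gibbs sampler for the Bayesian LASSO hierarchy:
    beta_j | tau_j^2, sig2 ~ N(0, sig2*tau_j^2), tau_j^2 ~ Exp(lam^2/2)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta, sig2, tau2 = np.zeros(p), 1.0, np.ones(p)
    XtX, Xty = X.T @ X, X.T @ y
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sig2 * A^{-1}),  A = X'X + diag(1/tau2)
        A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sig2 * A_inv)
        # 1/tau_j^2 | rest ~ InverseGaussian(sqrt(lam^2*sig2/beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sig2 / np.maximum(beta**2, 1e-10))
        tau2 = 1.0 / rng.wald(mu, lam**2)
        # sig2 | rest ~ InvGamma((n-1+p)/2, ||y - X beta||^2/2 + beta'D^{-1}beta/2)
        resid = y - X @ beta
        rate = 0.5 * (resid @ resid + beta @ (beta / tau2))
        sig2 = 1.0 / rng.gamma(0.5 * (n - 1 + p), 1.0 / rate)
        if it >= burn:
            draws.append(beta)
    return np.asarray(draws)

# Illustrative sparse regression: two active predictors out of eight.
rng = np.random.default_rng(1)
n, p = 120, 8
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)
post_mean = bayesian_lasso_gibbs(y, X).mean(axis=0)
```

The posterior means shrink the inactive coefficients toward zero while leaving the two active ones essentially untouched, which is the shrinkage-and-selection behaviour the abstract describes.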
  2. By: Victor Chernozhukov (Dept. of Economics, MIT); Ivan Fernandez-Val (Dept. of Economics, Boston University); Amanda Kowalski (Cowles Foundation, Yale University)
    Abstract: In this paper, we develop a new censored quantile instrumental variable (CQIV) estimator and describe its properties and computation. The CQIV estimator combines the censored quantile regression (CQR) of Powell (1986), which deals with censoring semiparametrically, with a control variable approach to incorporate endogenous regressors. The CQIV estimator is obtained in two stages that are nonadditive in the unobservables. The first stage estimates a nonadditive model with infinite dimensional parameters for the control variable, such as a quantile or distribution regression model. The second stage estimates a nonadditive censored quantile regression model for the response variable of interest, including the estimated control variable to deal with endogeneity. For computation, we extend the algorithm for CQR developed by Chernozhukov and Hong (2002) to incorporate the estimation of the control variable. We give generic regularity conditions for asymptotic normality of the CQIV estimator and for the validity of resampling methods to approximate its asymptotic distribution. We verify these conditions for quantile and distribution regression estimation of the control variable. We illustrate the computation and applicability of the CQIV estimator with numerical examples and an empirical application on estimation of Engel curves for alcohol.
    Keywords: Censored, Quantile, Instrumental variable, Censoring, Endogeneity, Engel curve, Alcohol
    JEL: C01 C14
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1797&r=ecm
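The control-variable logic behind CQIV can be illustrated in stripped-down form. The Python sketch below is illustrative only: it uses a linear first stage instead of the paper's quantile or distribution regression, omits the censoring step (Powell's CQR), and all variable names are ours. It keeps just the two-stage idea of adding a first-stage residual to a quantile regression to absorb endogeneity:

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(b, Z, y, tau):
    """Koenker-Bassett check (pinball) loss for quantile regression."""
    r = y - Z @ b
    return np.mean(np.maximum(tau * r, (tau - 1.0) * r))

def quantile_reg(Z, y, tau=0.5):
    """Quantile regression by direct minimisation of the check loss."""
    b0 = np.linalg.lstsq(Z, y, rcond=None)[0]          # OLS starting values
    res = minimize(check_loss, b0, args=(Z, y, tau), method="Nelder-Mead",
                   options={"xatol": 1e-7, "fatol": 1e-9, "maxiter": 5000})
    return res.x

# Endogenous design: d depends on instrument z and on v, and v also enters
# the outcome error, so a naive quantile regression of y on d is biased.
rng = np.random.default_rng(0)
n = 4000
z, v, e = rng.standard_normal((3, n))
d = z + v                                  # endogenous regressor
y = 1.0 + 1.0 * d + 0.8 * v + 0.6 * e      # true coefficient on d is 1
ones = np.ones(n)

# Stage 1 (simplified to a linear projection): control variable = residual.
Z1 = np.column_stack([ones, z])
vhat = d - Z1 @ np.linalg.lstsq(Z1, d, rcond=None)[0]

# Stage 2: median regression with and without the estimated control variable.
naive = quantile_reg(np.column_stack([ones, d]), y)[1]       # biased (~1.4 here)
ctrl = quantile_reg(np.column_stack([ones, d, vhat]), y)[1]  # close to 1
```

Including `vhat` restores consistency because, conditional on the control variable, the remaining error is independent of `d`, which is the mechanism the abstract's second stage exploits.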
  3. By: Luati, Alessandra; Proietti, Tommaso; Reale, Marco
    Abstract: The variance profile is defined as the power mean of the spectral density function of a stationary stochastic process. It is a continuous and non-decreasing function of the power parameter, p, which returns the minimum of the spectrum (p → −∞), the interpolation error variance (harmonic mean, p = −1), the prediction error variance (geometric mean, p = 0), the unconditional variance (arithmetic mean, p = 1) and the maximum of the spectrum (p → ∞). The variance profile provides a useful characterisation of a stochastic process; we focus in particular on the class of fractionally integrated processes. Moreover, it enables a direct and immediate derivation of the Szego-Kolmogorov formula and the interpolation error variance formula. The paper proposes a non-parametric estimator of the variance profile based on the power mean of the smoothed sample spectrum, and proves its consistency and its asymptotic normality. From the empirical standpoint, we propose and illustrate the use of the variance profile for estimating the long memory parameter in climatological and financial time series and for assessing structural change.
    Keywords: Predictability; Interpolation; Non-parametric spectral estimation; Long memory.
    JEL: C13 C14 C22
    Date: 2011–04–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:30378&r=ecm
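As a rough numerical companion, the power mean of a smoothed periodogram can be computed in a few lines. This sketch is ours (the smoothing span `m` is arbitrary) and checks two landmark cases on an AR(1) series: the arithmetic mean (p = 1) recovers the unconditional variance and the geometric mean (p = 0) approximates the prediction error variance of the Szego-Kolmogorov formula:

```python
import numpy as np

def power_mean(f, p):
    """Power mean M_p of spectral ordinates f; p = 0 is the geometric mean."""
    if p == 0:
        return np.exp(np.mean(np.log(f)))
    return np.mean(f ** p) ** (1.0 / p)

def variance_profile(x, p_grid, m=15):
    """Power means of a moving-average-smoothed periodogram (sketch only)."""
    n = len(x)
    per = np.abs(np.fft.fft(x - x.mean())) ** 2 / n   # periodogram
    per = per[1:]                                     # drop the zero frequency
    kernel = np.ones(2 * m + 1) / (2 * m + 1)
    f = np.convolve(per, kernel, mode="same")         # smoothed sample spectrum
    return np.array([power_mean(f, p) for p in p_grid])

# AR(1) example with phi = 0.5 and unit innovation variance.
rng = np.random.default_rng(0)
n, phi = 4096, 0.5
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

prof = variance_profile(x, [-1, 0, 1])
# p = 1 should be near the unconditional variance 1/(1-phi^2) = 4/3,
# p = 0 near the prediction error variance sigma^2 = 1 (Szego-Kolmogorov),
# and the profile is non-decreasing in p by the power mean inequality.
```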
  4. By: Lorenzo Camponovo (Dept. of Economics, University of Lugano); Taisuke Otsu (Cowles Foundation, Yale University)
    Abstract: This paper studies the robustness of bootstrap inference methods for instrumental variable regression models. In particular, we compare the uniform weight and implied probability bootstrap approximations for parameter hypothesis test statistics by applying breakdown point theory, which focuses on the behavior of the bootstrap quantiles when outliers take arbitrarily large values. The implied probabilities are derived from an information theoretic projection from the empirical distribution to a set of distributions satisfying the orthogonality conditions for the instruments. Our breakdown point analysis considers separately the effects of outliers in the dependent variables, endogenous regressors, and instruments, and clarifies the situations where the implied probability bootstrap can be more robust than the uniform weight bootstrap against outliers. The effects of the tail trimming introduced by Hill and Renault (2010) are also analyzed. Several simulation studies illustrate our theoretical findings.
    Keywords: Bootstrap, Breakdown point, Instrumental variable regression
    JEL: C12 C21 C31
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1796&r=ecm
  5. By: Lorenzo Camponovo (Department of Economics, University of Lugano); Taisuke Otsu (Cowles Foundation, Yale University)
    Abstract: This paper studies robustness of bootstrap inference methods under moment conditions. In particular, we compare the uniform weight and implied probability bootstraps by analyzing behaviors of the bootstrap quantiles when outliers take arbitrarily large values, and derive the breakdown points for those bootstrap quantiles. The breakdown point properties characterize the situation where the implied probability bootstrap is more robust than the uniform weight bootstrap against outliers. Simulation studies illustrate our theoretical findings.
    Keywords: Bootstrap, Breakdown point, GMM
    JEL: C12 C21 C31
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1793&r=ecm
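The contrast the authors study can be reproduced numerically. The sketch below is our construction for the simple moment condition E[x] = 0: a single large outlier inflates the uniform weight bootstrap quantile of the sample mean, while the empirical-likelihood implied probabilities downweight the outlier and temper its effect:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)
n, B = 50, 2000
x = rng.standard_normal(n)
x[0] = 50.0                        # a single large outlier
g = x - 0.0                        # moment function for H0: E[x] = 0

# Empirical-likelihood implied probabilities p_i = 1/(n(1 + lam*g_i)),
# with lam chosen so that sum_i p_i g_i = 0 (interior root of a
# monotone function on the interval keeping all weights positive).
lo, hi = -1.0 / g.max() + 1e-6, -1.0 / g.min() - 1e-6
lam = brentq(lambda l: np.sum(g / (1.0 + l * g)), lo, hi)
p = 1.0 / (n * (1.0 + lam * g))
p /= p.sum()

# Bootstrap distributions of the sample mean under each weighting scheme.
unif_means = np.array([rng.choice(x, n).mean() for _ in range(B)])
impl_means = np.array([rng.choice(x, n, p=p).mean() for _ in range(B)])
q_unif = np.quantile(unif_means, 0.975)
q_impl = np.quantile(impl_means, 0.975)
# The outlier receives implied probability below the uniform 1/n, so the
# implied probability bootstrap quantile is less inflated by it.
```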
  6. By: L. DAVEZIES (Insee)
    Abstract: This document presents the different ways to model heterogeneity in the presence of clustering, such as pupils' achievement within classrooms or schools. In the linear framework, it essentially discusses the pertinence of fixed- or random-effects assumptions depending on the goal pursued and the empirical evidence displayed by the data. The statistical assumptions of the different models are introduced and discussed in turn, as are the properties of the estimators derived from them. For each model, SAS code is provided. Hausman tests, and their use in choosing between models, are explained. Beyond the linear framework, binary models are presented in the last chapter.
    Keywords: fixed effect, random effect, Hausman test, multilevel model, mixed model
    JEL: C01 I21
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:crs:wpdeee:g2011-03&r=ecm
  7. By: Márcio Laurini (IBMEC Business School)
    Abstract: This paper discusses Bayesian procedures for factor selection in dynamic term structure models through simulation methods based on Markov Chain Monte Carlo. The number of factors, besides influencing the fitting and prediction of observed yields, is also relevant to features such as the imposition of no-arbitrage conditions. We present a methodology for selecting the best specification in the Nelson-Siegel class of models using Reversible Jump MCMC.
    Keywords: Dynamic Term Structure Models, Model Selection, Reversible Jump MCMC
    JEL: C11 C15 G12
    Date: 2011–04–18
    URL: http://d.repec.org/n?u=RePEc:ibr:dpaper:2011-02&r=ecm
  8. By: Xiaohong Chen (Cowles Foundation, Yale University); Victor Chernozhukov (Dept. of Economics, MIT); Sokbae Lee (Dept. of Economics, Seoul National University); Whitney Newey (Dept. of Economics, MIT)
    Abstract: In parametric models a sufficient condition for local identification is that the vector of moment conditions is differentiable at the true parameter with full rank derivative matrix. We show that there are corresponding sufficient conditions for nonparametric models. A nonparametric rank condition and differentiability of the moment conditions with respect to a certain norm imply local identification. It turns out these conditions are slightly stronger than needed and are hard to check, so we provide weaker and more primitive conditions. We extend the results to semiparametric models. We illustrate the sufficient conditions with endogenous quantile and single index examples. We also consider a semiparametric habit-based, consumption capital asset pricing model. There we find the rank condition is implied by an integral equation of the second kind having a one-dimensional null space.
    Keywords: Identification, Local identification, Nonparametric models, Asset pricing
    JEL: C12 C13 C23
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1795&r=ecm
  9. By: Bernard Bercu; Frederic Proia
    Abstract: The purpose of this paper is to provide a sharp analysis of the asymptotic behavior of the Durbin-Watson statistic. We focus our attention on the first-order autoregressive process where the driven noise is also given by a first-order autoregressive process. We establish the almost sure convergence and the asymptotic normality both for the least squares estimator of the unknown parameter of the autoregressive process and for the serial correlation estimator associated with the driven noise. In addition, the almost sure rates of convergence of our estimates are provided. This allows us to establish the almost sure convergence and the asymptotic normality of the Durbin-Watson statistic. Finally, we propose a new bilateral statistical test for residual autocorrelation.
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1104.3328&r=ecm
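For reference, the Durbin-Watson statistic and its link to the residual lag-one autocorrelation (DW ≈ 2(1 − r1)) are easy to reproduce. The simulation design below mirrors the paper's setting of an AR(1) process driven by AR(1) noise, though the code is our illustration, not the authors':

```python
import numpy as np

def durbin_watson(e):
    """DW = sum of squared successive residual differences / sum of squares."""
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# AR(1) process x with AR(1) driven noise eps, as in the paper's setting.
rng = np.random.default_rng(0)
n, theta, rho = 5000, 0.5, 0.3
eps = np.empty(n)
eps[0] = rng.standard_normal()
for t in range(1, n):
    eps[t] = rho * eps[t - 1] + rng.standard_normal()
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = theta * x[t - 1] + eps[t]

# Least squares estimate of theta and its residuals.
theta_hat = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
e = x[1:] - theta_hat * x[:-1]
dw = durbin_watson(e)
r1 = np.sum(e[1:] * e[:-1]) / np.sum(e ** 2)
# DW equals 2*(1 - r1) up to end-point terms of order 1/n; with rho > 0 the
# residuals remain positively autocorrelated and DW settles below 2.
```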
  10. By: Mikkel Barslund
    Abstract: We address the issue of endogenous expenditures in the context of a censored demand system by an augmented regression approach estimated with a two-step estimator. An application to food demand by urban households in Mozambique shows that accounting for endogeneity is potentially important in obtaining reliable point estimates of price and, in particular, expenditure elasticities. Furthermore, a bootstrap approach to obtaining confidence intervals when data are clustered - as is the case with most household surveys - is devised. Based on a Monte Carlo exercise, we speculate that previous studies, by failing to account for the clustered nature of the data, overstate the precision with which elasticities are estimated.
    Keywords: Censored demand system, endogeneity, survey data, elasticities, Mozambique, food demand
    JEL: D12 O12
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:lic:licosd:28011&r=ecm
  11. By: Jason J. Wu; Aaron L. Game
    Abstract: This paper proposes a residual based cointegration test with improved power. Based on the ideas of Hansen (1995) and Elliott & Jansson (2003) in the unit root testing case, stationary covariates are used to improve the power of the residual based Augmented Dickey Fuller (ADF) test. The asymptotic null distribution contains difficult-to-estimate nuisance parameters for which there is no obvious method of estimation; we therefore propose a bootstrap methodology to obtain test critical values. Local-to-unity asymptotics and Monte Carlo simulations are used to evaluate the power of the test in large and small samples, respectively. These exercises show that the addition of covariates increases power relative to the ADF and Johansen tests, and that the power depends on the long-run correlation between the covariates and the cointegration candidates. The new test is used to test for cointegration between Credit Default Swap (CDS) and corporate bond spreads for a panel of U.S. firms during the 2007-2009 financial crisis. The new test finds stronger evidence for cointegration between the two spreads for more firms, relative to the ADF and Johansen tests.
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2011-18&r=ecm
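The baseline the proposed test improves on, the residual-based ADF (Engle-Granger) cointegration test, can be sketched as follows. The covariate augmentation and bootstrap critical values of the paper are not attempted here, and the simulated cointegrated pair is our own:

```python
import numpy as np

def adf_tstat(u, lags=1):
    """t-statistic on u_{t-1} in the ADF regression (no constant):
    du_t = rho*u_{t-1} + sum_j c_j*du_{t-j} + e_t."""
    du = np.diff(u)
    y = du[lags:]
    X = np.column_stack([u[lags:-1]] +
                        [du[lags - j:-j] for j in range(1, lags + 1)])
    b = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ b
    s2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[0, 0])
    return b[0] / se

# Cointegrated pair: a common random walk plus a stationary AR(1) spread.
rng = np.random.default_rng(0)
T = 500
walk = np.cumsum(rng.standard_normal(T))
spread = np.empty(T)
spread[0] = rng.standard_normal()
for t in range(1, T):
    spread[t] = 0.3 * spread[t - 1] + rng.standard_normal()
y1, y2 = walk, 2.0 + walk + spread

# Step 1: cointegrating regression; Step 2: ADF test on its residuals.
Z = np.column_stack([np.ones(T), y1])
u = y2 - Z @ np.linalg.lstsq(Z, y2, rcond=None)[0]
tstat = adf_tstat(u)
# Compare with the Engle-Granger 5% critical value of roughly -3.34 (two
# variables, constant, no trend); rejection indicates cointegration. Note
# these critical values differ from the standard ADF tables because the
# residuals come from an estimated regression.
```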
  12. By: Sirio Aramonte; Marius del Giudice Rodriguez; Jason J. Wu
    Abstract: Trading portfolios at financial institutions are typically driven by a large number of financial variables. These variables are often correlated with each other and exhibit time-varying volatilities. We propose a computationally efficient Value-at-Risk (VaR) methodology based on Dynamic Factor Models (DFM) that can be applied to portfolios with time-varying weights, and that, unlike the popular Historical Simulation (HS) and Filtered Historical Simulation (FHS) methodologies, can handle time-varying volatilities and correlations for a large set of financial variables. We test the DFM-VaR on three stock portfolios that cover the 2007-2009 financial crisis, and find that it reduces the number and average size of back-testing breaches relative to HS-VaR and FHS-VaR. DFM-VaR also outperforms HS-VaR when applied to risk measurement of individual stocks that are exposed to systematic risk.
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2011-19&r=ecm
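A minimal version of a factor-based VaR pipeline, assuming full-sample PCA loadings (a look-ahead simplification a real implementation would avoid) and a RiskMetrics-style EWMA for the factor covariance; the simulated panel and all parameter choices are ours, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, k = 1000, 30, 3
# Panel of returns: one common factor whose volatility jumps mid-sample,
# plus idiosyncratic noise (purely illustrative data).
sig = np.where(np.arange(T) < T // 2, 1.0, 2.5)
fac = sig * rng.standard_normal(T)
load = rng.uniform(0.5, 1.5, N)
R = np.outer(fac, load) + 0.5 * rng.standard_normal((T, N))

# Static PCA step: loadings are the top-k eigenvectors of the sample
# covariance; factors are the corresponding projections of the returns.
_, vecs = np.linalg.eigh(np.cov(R, rowvar=False))
V = vecs[:, -k:]
F = R @ V                                 # T x k factor series
idio_var = (R - F @ V.T).var(axis=0)      # per-asset idiosyncratic variance

w = np.full(N, 1.0 / N)                   # equally weighted portfolio
a = V.T @ w                               # the portfolio's factor exposures

# EWMA factor covariance and one-step-ahead 95% VaR, with breach counting.
lam, z = 0.94, 1.645
H = np.cov(F[:50], rowvar=False)
breaches = 0
for t in range(50, T - 1):
    H = lam * H + (1 - lam) * np.outer(F[t], F[t])
    VaR = z * np.sqrt(a @ H @ a + (w ** 2) @ idio_var)
    breaches += (R[t + 1] @ w) < -VaR
rate = breaches / (T - 1 - 50)
# A well-calibrated one-sided 95% VaR is breached on roughly 5% of days,
# even though the factor volatility regime changes mid-sample.
```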
  13. By: Christophe Boucher (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, A.A.Advisors-QCG - ABN AMRO); Bertrand Maillet (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, A.A.Advisors-QCG - ABN AMRO, EIF - Europlace Institute of Finance)
    Abstract: We provide an economic valuation of the riskiness of risk models by directly measuring the impact of model risks (specification and estimation risks) on VaR estimates. We find that integrating the model risk into the VaR computations implies a substantial minimum correction of the order of 10-40% of VaR levels. We also present results of a practical method - based on a backtesting framework - for incorporating the model risk into the VaR estimates.
    Keywords: Model risk, quantile estimation, VaR, Basel II validation test.
    Date: 2011–03
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00587779&r=ecm
  14. By: Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Bertrand Hassani (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, BPCE - BPCE)
    Abstract: The Basel Advanced Measurement Approach requires financial institutions to compute capital requirements on internal data sets. In this paper we introduce a new methodology permitting capital requirements to be linked with operational risks. The data are arranged in a matrix of 56 cells. Constructing a vine architecture, which is a bivariate decomposition of an n-dimensional structure (n > 2), we present a novel approach to compute multivariate operational risk VaRs. We discuss multivariate results regarding the impact of the dependence structure on the one hand, and of LDF modeling on the other. Our method is simple to carry out, easy to interpret and complies with the new Basel Committee requirements.
    Keywords: Operational risks, vine copula, loss distribution function, nested structure, VaR.
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00587706&r=ecm
  15. By: Theo S Eicher; Lindy Helfman; Alex Lenkoski
    Abstract: The literature on Foreign Direct Investment (FDI) determinants is remarkably diverse in terms of competing theories and empirical results. We utilize Bayesian Model Averaging (BMA) to resolve the model uncertainty that surrounds the validity of the competing FDI theories. Since the structure of existing FDI data is known to induce selection bias, we extend BMA theory to HeckitBMA to address model uncertainty in the presence of selection bias. We then show that more than half of the previously suggested FDI determinants are no longer robust and highlight theories that receive support from the data. In addition, our selection approach allows us to highlight that the determinants of margins of FDI (intensive and extensive) differ profoundly in the data, while FDI theories do not usually model this aspect explicitly.
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:udb:wpaper:uwec-2011-07&r=ecm
  16. By: Pesaran, Hashem (University of Cambridge); Smith, Ron P. (Birkbeck College, University of London)
    Abstract: Academic macroeconomics and the research departments of central banks have come to be dominated by Dynamic Stochastic General Equilibrium (DSGE) models based on the micro-foundations of optimising representative agents with rational expectations. We argue that the dominance of this particular sort of DSGE model, and the resistance of some in the profession to alternatives, has become a straitjacket that restricts empirical and theoretical experimentation and inhibits innovation, and that the profession should embrace a more flexible approach to macroeconometric modelling. We describe one possible approach.
    Keywords: macroeconometric models, DSGE, VARs, long run theory
    JEL: C1 E1
    Date: 2011–04
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp5661&r=ecm

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.