nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒07‒02
fourteen papers chosen by
Sune Karlsson
Orebro University

  1. Asymptotics for a Bayesian nonparametric estimator of species richness By Stefano Favaro; Antonio Lijoi; Igor Prunster
  2. Parametric inference and forecasting in continuously invertible volatility models By Wintenberger, Olivier; Cai, Sixiang
  3. Objective Bayes Factors for Gaussian Directed Acyclic Graphical Models By Guido Consonni; Luca La Rocca
  4. Dynamic Large Spatial Covariance Matrix Estimation in Application to Semiparametric Model Construction via Variable Clustering: the SCE approach By Song Song
  5. Large Vector Auto Regressions By Song Song; Peter J. Bickel
  6. On the Nonparametric Tests of Univariate GARCH Regression Models By Wasel Shadat
  7. Hierarchical shrinkage in time-varying parameter models By Miguel, Belmonte; Gary, Koop; Dimitris, Korobilis
  8. Objective Bayesian Search of Gaussian DAG Models with Non-local Priors By Davide Altomare; Guido Consonni; Luca La Rocca
  9. Testing for Clustering of Industries - Evidence from micro geographic data By Tobias Scholl; Thomas Brenner
  10. Forecasting Contemporaneous Aggregates with Stochastic Aggregation Weights By Ralf Brüggemann; Helmut Lütkepohl
  11. Geometric Allocation Approach for Transition Kernel of Markov Chain By Hidemaro Suwa; Synge Todo
  12. Analyzing Fixed-event Forecast Revisions By Philip Hans Franses; Chia-Lin Chang; Michael McAleer
  13. Measuring and testing for the systemically important financial institutions By Carlos Castro; Stijn Ferrari
  14. Pragmatism, Perspectival Realism, and Econometrics By Kevin D. Hoover

  1. By: Stefano Favaro (University of Turin and Collegio Carlo Alberto); Antonio Lijoi (Department of Economics and Quantitative Methods, University of Pavia, and Collegio Carlo Alberto); Igor Prunster (University of Turin and Collegio Carlo Alberto)
    Abstract: In Bayesian nonparametric inference, random discrete probability measures are commonly used as priors within hierarchical mixture models for density estimation and for inference on the clustering of the data. Recently it has been shown that they can also be exploited in species sampling problems: indeed they are natural tools for modeling the random proportions of species within a population thus allowing for inference on various quantities of statistical interest. For applications that involve large samples, the exact evaluation of the corresponding estimators becomes impracticable and, therefore, asymptotic approximations are sought. In the present paper we study the limiting behaviour of the number of new species to be observed from further sampling, conditional on observed data, assuming the observations are exchangeable and directed by a normalized generalized gamma process prior. Such an asymptotic study highlights a connection between the normalized generalized gamma process and the two–parameter Poisson–Dirichlet process that was previously known only in the unconditional case.
    Keywords: Bayesian Nonparametrics; Species sampling models; Asymptotics; s–diversity; Polynomially and exponentially tilted random variables; Completely random measures; Normalized generalized gamma process; Two parameter Poisson–Dirichlet process.
    Date: 2011–05
  2. By: Wintenberger, Olivier; Cai, Sixiang
    Abstract: We introduce the notion of continuously invertible volatility models, which relies on a Lyapunov condition and a regularity condition. We show that it is almost equivalent to the efficiency of volatility forecasting in the parametric inference approach based on the Stochastic Recurrence Equation (SRE) given in Straumann (2005). Under very weak assumptions, we prove the strong consistency and the asymptotic normality of an estimator based on the SRE. From this parametric estimation, we deduce a natural forecast of the volatility that is strongly consistent. We successfully apply this approach to recover known results on univariate and multivariate GARCH-type models, where our estimator coincides with the QMLE. In the EGARCH(1,1) model, we apply this approach to obtain a strongly consistent forecast and to prove that our estimator is asymptotically normal when the limiting covariance matrix exists. Finally, we give some encouraging empirical results of our approach on simulations and real data.
    Keywords: Invertibility; volatility models; parametric estimation; strong consistency; asymptotic normality; asymmetric GARCH; exponential GARCH; stochastic recurrence equation; stationarity.
    JEL: C13 C32 C53 C01
    Date: 2011–06–20
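    The SRE-based volatility forecasts the authors study generalize the familiar recursive volatility filter of GARCH-type models. As a point of reference only, here is a minimal sketch of the plain GARCH(1,1) recursion (the function name and parameter values are illustrative, not from the paper):

```python
import numpy as np

def garch11_vol_forecast(eps, omega, alpha, beta):
    """Filter a GARCH(1,1) conditional-variance path from return shocks eps
    and return it with the one-step-ahead variance forecast:
    sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1}."""
    sigma2 = np.empty(len(eps))
    sigma2[0] = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(1, len(eps)):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    one_step = omega + alpha * eps[-1] ** 2 + beta * sigma2[-1]
    return sigma2, one_step

# Hypothetical parameters applied to simulated shocks, for illustration only
rng = np.random.default_rng(0)
eps = rng.standard_normal(500)
sigma2, f1 = garch11_vol_forecast(eps, omega=0.05, alpha=0.05, beta=0.90)
```

    Roughly speaking, invertibility in the paper's sense is what guarantees that a recursion of this kind, started from an arbitrary initial value, converges to the true volatility path.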
  3. By: Guido Consonni (Department of Economics and Quantitative Methods, University of Pavia); Luca La Rocca (Dipartimento di Comunicazione e Economia, University of Modena and Reggio Emilia)
    Abstract: We propose an objective Bayesian method for the comparison of all Gaussian directed acyclic graphical models defined on a given set of variables. The method, which is based on the notion of fractional Bayes factor, requires a single default (typically improper) prior on the space of unconstrained covariance matrices, together with a prior sample size hyper-parameter, which can be set to its minimal value. We show that our approach produces genuine Bayes factors. The implied prior on the concentration matrix of any complete graph is a data-dependent Wishart distribution, and this in turn guarantees that Markov equivalent graphs are scored with the same marginal likelihood. We specialize our results to the smaller class of Gaussian decomposable undirected graphical models, and show that in this case they coincide with those recently obtained using limiting versions of hyper-inverse Wishart distributions as priors on the graph-constrained covariance matrices.
    Keywords: Bayes factor; Bayesian model selection; Directed acyclic graph; Exponential family; Fractional Bayes factor; Gaussian graphical model; Objective Bayes; Standard conjugate prior; Stochastic search; Structural learning.
    Date: 2011–03
  4. By: Song Song
    Abstract: To better understand the spatial structure of large panels of economic and financial time series and provide a guideline for constructing semiparametric models, this paper first considers estimating a large spatial covariance matrix of the generalized $m$-dependent and $\beta$-mixing time series (with $J$ variables and $T$ observations) by hard thresholding regularization as long as $\log J \cdot \chi^*(\tau) / T = \mathcal{O}(1)$ (the former scheme, with some time dependence measure $\chi^*(\tau)$) or $\log J / T = \mathcal{O}(1)$ (the latter scheme, with some upper bounded mixing coefficient). We quantify the interplay between the estimators' consistency rate and the time dependence level, discuss an intuitive resampling scheme for threshold selection, and also prove a general cross-validation result justifying this. Given a consistently estimated covariance (correlation) matrix, by utilizing its natural links with graphical models and semiparametrics, after "screening" the (explanatory) variables, we implement a novel forward (and backward) label permutation procedure to cluster the "relevant" variables and construct the corresponding semiparametric model, which is further estimated by the groupwise dimension reduction method with sign constraints. We call this the SCE (screen - cluster - estimate) approach for modeling high dimensional data with complex spatial structure. Finally we apply this method to study the spatial structure of large panels of economic and financial time series and find the proper semiparametric structure for estimating the consumer price index (CPI) to illustrate its superiority over the linear models.
    Date: 2011–06
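    The hard-thresholding step in the abstract can be illustrated with a small generic sketch (a thresholded sample covariance only, not the paper's full SCE pipeline; the threshold value is arbitrary):

```python
import numpy as np

def hard_threshold_cov(X, tau):
    """Hard-threshold the sample covariance of X (T observations x J variables):
    off-diagonal entries with |s_ij| <= tau are set to zero; variances are kept."""
    S = np.cov(X, rowvar=False)
    mask = np.abs(S) > tau
    np.fill_diagonal(mask, True)  # never threshold the diagonal
    return S * mask

# Toy data: 200 observations of 5 independent series
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
S_hat = hard_threshold_cov(X, tau=0.1)
```

    The paper's contribution lies in showing how fast the threshold may shrink with $T$, given the time dependence of the series, while retaining consistency; a simple covariance estimator like this ignores those considerations.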
  5. By: Song Song; Peter J. Bickel
    Abstract: One popular approach to nonstructural economic and financial forecasting is to include a large number of economic and financial variables, which has been shown to lead to significant forecasting improvements, for example by dynamic factor models. A challenging issue is to determine which variables and which of their lags are relevant, especially when there is a mixture of serial correlation (temporal dynamics), a high dimensional (spatial) dependence structure and a moderate sample size (relative to dimensionality and lags). To this end, an \textit{integrated} solution that addresses these three challenges simultaneously is appealing. We study large vector auto regressions here with three types of estimates. We treat each variable's own lags differently from other variables' lags, distinguish different lags over time, and are able to select the variables and lags simultaneously. We first show the consequences of using a Lasso-type estimate directly for time series without considering the temporal dependence. In contrast, our proposed method can still produce an estimate as efficient as an \textit{oracle} under such scenarios. The tuning parameters are chosen via a data driven "rolling scheme" method to optimize the forecasting performance. A macroeconomic and financial forecasting problem is considered to illustrate the method's superiority over existing estimators.
    Date: 2011–06
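    As a rough illustration of the Lasso-type estimates discussed above (not the authors' estimator, which treats own lags and other lags differently and tunes via a rolling scheme), a plain equation-wise Lasso VAR can be sketched with proximal gradient descent:

```python
import numpy as np

def soft_threshold(z, lam):
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def lasso_var(Y, p=1, lam=0.1, n_iter=500):
    """Equation-wise Lasso for a VAR(p) via proximal gradient (ISTA).
    Y: T x J array; returns a J x (J*p) coefficient matrix B with
    Y_t ~ B @ x_t, where x_t stacks the lags Y_{t-1}, ..., Y_{t-p}."""
    T, J = Y.shape
    X = np.hstack([Y[p - k:T - k] for k in range(1, p + 1)])  # (T-p) x (J*p)
    Z = Y[p:]
    n = len(Z)
    step = n / np.linalg.norm(X.T @ X, 2)  # 1 / Lipschitz constant of the gradient
    B = np.zeros((J, J * p))
    for _ in range(n_iter):
        grad = (X @ B.T - Z).T @ X / n       # gradient of the least-squares part
        B = soft_threshold(B - step * grad, step * lam)
    return B

# Simulated diagonal VAR(1): the off-diagonal coefficients should be shrunk away
rng = np.random.default_rng(1)
A = np.array([[0.5, 0.0], [0.0, 0.4]])
Y = np.zeros((300, 2))
for t in range(1, 300):
    Y[t] = A @ Y[t - 1] + rng.standard_normal(2)
B = lasso_var(Y, p=1, lam=0.01)
```

    The abstract's point is that applying such an estimator directly, ignoring the temporal dependence, has consequences for efficiency that their integrated approach avoids.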
  6. By: Wasel Shadat
    Date: 2011
  7. By: Miguel, Belmonte; Gary, Koop; Dimitris, Korobilis
    Abstract: In this paper, we forecast EU-area inflation with many predictors using time-varying parameter models. The facts that time-varying parameter models are parameter-rich and the time span of our data is relatively short motivate a desire for shrinkage. In constant coefficient regression models, the Bayesian Lasso is gaining increasing popularity as an effective tool for achieving such shrinkage. In this paper, we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows for the coefficient on each predictor to be: i) time varying, ii) constant over time or iii) shrunk to zero. The econometric methodology decides automatically which category each coefficient belongs in. Our empirical results indicate the benefits of such an approach.
    Keywords: Forecasting; hierarchical prior; time-varying parameters; Bayesian Lasso
    JEL: C52 E37 C11 E47
    Date: 2011–06–20
  8. By: Davide Altomare (Department of Mathematics, University of Pavia); Guido Consonni (Department of Economics and Quantitative Methods, University of Pavia); Luca La Rocca (Dipartimento di Comunicazione e Economia, University of Modena and Reggio Emilia)
    Abstract: Directed Acyclic Graphical (DAG) models are increasingly employed in the study of physical and biological systems, where directed edges between vertices are used to model direct influences between variables. Identifying the graph from data is a challenging endeavor, which can be more reasonably tackled if the variables are assumed to satisfy a given ordering; in this case, we simply have to estimate the presence or absence of each possible edge, whose direction is established by the ordering of the variables. We propose an objective Bayesian methodology for model search over the space of Gaussian DAG models, which only requires default non-local priors as inputs. Priors of this kind are especially suited to learn sparse graphs, because they allow a faster learning rate, relative to ordinary local priors, when the true unknown sampling distribution belongs to a simple model. We implement an efficient stochastic search algorithm, which deals effectively with data sets having sample size smaller than the number of variables. We apply our method to a variety of simulated and real data sets.
    Keywords: Fractional Bayes factor; High-dimensional sparse graph; Moment prior; Non-local prior; Objective Bayes; Pathway based prior; Regulatory network; Stochastic search; Structural learning.
    Date: 2011–03
  9. By: Tobias Scholl (EBS European Business School); Thomas Brenner (Department of Geography, Philipps University Marburg)
    Abstract: We present a new statistical method that describes the localization patterns of industries in a continuous space. The proposed method does not divide space into subunits and is therefore not affected by the Modifiable Areal Unit Problem (MAUP). Our method fulfils all five criteria for a spatial statistical test of localization proposed by Duranton and Overman (2005) and improves on them with respect to the significance of its results. Additionally, our test allows inference on the localization of highly clustered firms. Furthermore, the algorithm is computationally efficient, which eases its use in research.
    Keywords: Spatial concentration, localization, clusters, MAUP, distance-based measures
    JEL: C40 C60 R12
    Date: 2011–06
  10. By: Ralf Brüggemann (Department of Economics, University of Konstanz, Germany); Helmut Lütkepohl (Department of Economics, European University Institute, Italy)
    Abstract: Many contemporaneously aggregated variables have stochastic aggregation weights. We compare different forecasts for such variables including univariate forecasts of the aggregate, a multivariate forecast of the aggregate that uses information from the disaggregate components, a forecast which aggregates a multivariate forecast of the disaggregate components and the aggregation weights, and a forecast which aggregates univariate forecasts for individual disaggregate components and the aggregation weights. In empirical illustrations based on aggregate GDP and money growth rates, we find forecast efficiency gains from using the information in the stochastic aggregation weights. A Monte Carlo study confirms that using the information on stochastic aggregation weights explicitly may result in forecast mean squared error reductions.
    Keywords: Aggregation, autoregressive process, mean squared error
    JEL: C32
    Date: 2011–04–21
  11. By: Hidemaro Suwa; Synge Todo
    Abstract: We introduce a new geometric approach that constructs a transition kernel of a Markov chain. Our method always minimizes the average rejection rate and even reduces it to zero in many relevant cases, which cannot be achieved by conventional methods such as the Metropolis-Hastings algorithm or the heat bath algorithm (Gibbs sampler). Moreover, the geometric approach makes it possible to find not only a reversible but also an irreversible solution of rejection-free transition probabilities. This is the first versatile method that can construct an irreversible transition kernel in general cases. We demonstrate that the autocorrelation time (asymptotic variance) for the Potts model becomes more than six times shorter than with the conventional Metropolis-Hastings algorithm. Our algorithms are applicable to almost all kinds of Markov chain Monte Carlo methods and will improve their efficiency.
    Date: 2011–06
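    For contrast with the geometric allocation approach, the conventional Metropolis baseline for a discrete target, with its rejection rate made explicit, can be sketched as follows (illustrative code, not the authors' algorithm):

```python
import numpy as np

def metropolis_discrete(pi, n_steps, rng):
    """Standard Metropolis sampler for a discrete target pi (the baseline the
    geometric allocation approach improves on): propose a uniformly random
    other state, accept with probability min(1, pi[new] / pi[old])."""
    n = len(pi)
    state, rejections = 0, 0
    samples = np.empty(n_steps, dtype=int)
    for t in range(n_steps):
        prop = int(rng.integers(n - 1))
        if prop >= state:        # map to a uniform draw over the *other* states
            prop += 1
        if rng.random() < min(1.0, pi[prop] / pi[state]):
            state = prop
        else:
            rejections += 1      # rejections are what geometric allocation removes
        samples[t] = state
    return samples, rejections / n_steps

pi = np.array([0.1, 0.2, 0.3, 0.4])
samples, rej_rate = metropolis_discrete(pi, 20000, np.random.default_rng(0))
```

    The sampler above necessarily rejects some proposals whenever the target is non-uniform; the paper's geometric construction reallocates transition probability so that this rejection rate is minimized, and often driven to zero.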
  12. By: Philip Hans Franses; Chia-Lin Chang; Michael McAleer (University of Canterbury)
    Abstract: It is common practice to evaluate fixed-event forecast revisions in macroeconomics by regressing current revisions on one-period lagged revisions. Under weak-form efficiency, the correlation between the current and one-period lagged revisions should be zero. The empirical findings in the literature suggest that the null hypothesis of zero correlation between the current and one-period lagged revisions is rejected quite frequently, where the correlation can be either positive or negative. In this paper we propose a methodology to be able to interpret such non-zero correlations in a straightforward manner. Our approach is based on the assumption that forecasts can be decomposed into both an econometric model and expert intuition. The interpretation of the sign of the correlation between the current and one-period lagged revisions depends on the process governing intuition, and the correlation between intuition and news.
    Keywords: Evaluating forecasts; Macroeconomic forecasting; Rationality; Intuition; Weak-form efficiency; Fixed-event forecasts
    JEL: C22 C53 E27 E37
    Date: 2011–06–01
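    The weak-form efficiency regression described above is straightforward to sketch: regress current revisions on their one-period lag and inspect the slope (the helper name is hypothetical, and i.i.d. revisions are simulated purely for illustration):

```python
import numpy as np

def revision_efficiency_slope(revisions):
    """OLS slope (with intercept) of current fixed-event forecast revisions on
    their one-period lag; under weak-form efficiency the slope should be zero."""
    r = np.asarray(revisions, dtype=float)
    y, x = r[1:], r[:-1]
    X = np.column_stack([np.ones_like(x), x])       # intercept + lagged revision
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Revisions that are pure i.i.d. news: the estimated slope should be near zero
rng = np.random.default_rng(2)
slope = revision_efficiency_slope(rng.standard_normal(5000))
```

    The paper's contribution is the interpretation of a *non-zero* slope, via the decomposition of forecasts into a model component and expert intuition.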
  13. By: Carlos Castro; Stijn Ferrari
    Abstract: This paper analyzes the measure of systemic importance ΔCoVaR proposed by Adrian and Brunnermeier (2009, 2010) within the context of a similar class of risk measures used in the risk management literature. In addition, we develop a series of testing procedures, based on ΔCoVaR, to identify and rank the systemically important institutions. We stress the importance of statistical testing in interpreting the measure of systemic importance. An empirical application illustrates the testing procedures, using equity data for three European banks.
    Date: 2011–06–21
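    Adrian and Brunnermeier estimate CoVaR by quantile regression; a cruder empirical analogue conveys the idea of ΔCoVaR as the shift in the system's tail quantile between the institution's distress state and its normal state (a simplified sketch, not the paper's testing procedure):

```python
import numpy as np

def delta_covar(system, inst, q=0.05):
    """Empirical Delta-CoVaR sketch: the q-quantile of system returns on days
    when the institution is at or below its own q-quantile (distress), minus
    the q-quantile on days when it is near its median (normal times)."""
    system = np.asarray(system, dtype=float)
    inst = np.asarray(inst, dtype=float)
    distress = inst <= np.quantile(inst, q)
    dev = np.abs(inst - np.median(inst))
    normal = dev <= np.median(dev)  # the half of the sample closest to the median
    return np.quantile(system[distress], q) - np.quantile(system[normal], q)

# Simulated correlated returns (hypothetical, for illustration): a bank whose
# returns co-move with the system should have a negative Delta-CoVaR
rng = np.random.default_rng(3)
inst = rng.standard_normal(5000)
system = 0.8 * inst + 0.6 * rng.standard_normal(5000)
dcv = delta_covar(system, inst, q=0.05)
```

    Point estimates like this are exactly what the paper argues should not be ranked without formal statistical tests of their differences.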
  14. By: Kevin D. Hoover
    Abstract: Econometricians tend to hold simultaneously two views in tension with each other: an apparent anti-realism that holds that all models are false and at best useful constructs or approximations to true models and an apparent realism that models are to be judged by their success at capturing an independent reality. This tension is resolved starting from Ronald Giere’s perspectival realism. Perspectival realism can itself be seen as a species of pragmatism, as that term is understood by its originator, Charles S. Peirce.
    Keywords: econometrics, realism, perspectival realism, perspectivism, pragmatism, models, Ronald Giere, Charles S. Peirce
    JEL: B40 B41 C10
    Date: 2010

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.