
on Econometrics 
By:  Korobilis, Dimitris 
Abstract:  This paper proposes two distinct contributions to econometric analysis of large information sets and structural instabilities. First, it treats a regression model with time-varying coefficients, stochastic volatility and exogenous predictors as an equivalent high-dimensional static regression problem with thousands of covariates. Inference in this specification proceeds using Bayesian hierarchical priors that shrink the high-dimensional vector of coefficients either towards zero or time-invariance. Second, it introduces the frameworks of factor graphs and message passing as a means of designing efficient Bayesian estimation algorithms. In particular, a Generalized Approximate Message Passing (GAMP) algorithm is derived that has low algorithmic complexity and is trivially parallelizable. The result is a comprehensive methodology that can be used to estimate time-varying parameter regressions with an arbitrarily large number of exogenous predictors. In a forecasting exercise for U.S. price inflation this methodology is shown to work very well. 
Keywords:  high-dimensional inference; factor graph; Belief Propagation; Bayesian shrinkage; time-varying parameter model 
JEL:  C01 C11 C13 C52 C53 C61 E31 E37 
Date:  2019–09–15 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:96079&r=all 
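The reformulation described in the abstract above can be sketched concretely: with random-walk coefficients, a time-varying parameter regression y_t = x_t'β_t is a static regression of y on a block lower-triangular expansion of the regressors, whose coefficient vector stacks the initial state and the period-by-period increments. The sketch below (a hypothetical helper, which omits the paper's stochastic volatility, shrinkage priors, and GAMP algorithm) shows only that expansion.

```python
import numpy as np

def expand_tvp_design(X):
    """Rewrite the TVP regression y_t = x_t' beta_t, beta_t = beta_{t-1} + eta_t
    as a static regression y = Z theta, where theta stacks the initial state
    and the T-1 increments eta_t. Z is T x (T*k) and block lower-triangular,
    since beta_t is the cumulative sum of the increments up to time t."""
    T, k = X.shape
    Z = np.zeros((T, T * k))
    for t in range(T):
        for s in range(t + 1):
            Z[t, s * k:(s + 1) * k] = X[t]  # x_t multiplies every increment up to t
    return Z
```

Shrinking later blocks of theta towards zero then corresponds to shrinking beta_t towards time-invariance, which is the role of the hierarchical priors in the paper.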
By:  Jennifer Castle; Jurgen Doornik; David Hendry 
Abstract:  We investigate the role of the significance level when selecting models for forecasting, as it controls both the null retention frequency and the probability of retaining relevant variables when using binary decisions to retain or drop variables. Analysis identifies the best selection significance level in a bivariate model when there are location shifts at or near the forecast origin. The trade-off for selecting variables in forecasting models in a stationary world, namely that variables should be retained if their non-centralities exceed 1, applies in the wide-sense non-stationary settings with structural breaks examined here. The results confirm the optimality of the Akaike Information Criterion for forecasting in completely different settings than initially derived. An empirical illustration forecasting UK inflation demonstrates the applicability of the analytics. Simulation then explores the choice of selection significance level for 1-step ahead forecasts in larger models when there are unknown location shifts present under a range of alternative scenarios, using the multi-path tree search algorithm Autometrics (Doornik, 2009), varying the target significance level for the selection of regressors. The costs of model selection are shown to be small. The results provide support for model selection at looser than conventional settings, albeit with many additional features explaining the forecast performance, with the caveat that retaining irrelevant variables that are subject to location shifts can worsen forecast performance. 
Keywords:  Model selection; forecasting; location shifts; significance level; Autometrics 
Date:  2018–11–09 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:861&r=all 
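The retention trade-off the abstract analyzes can be illustrated with a textbook approximation (a generic calculation, not the authors' analysis): a regressor whose t-statistic has non-centrality psi survives a two-sided test with critical value c with probability P(|Z + psi| > c), Z standard normal.

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def retention_prob(psi, c):
    """P(|Z + psi| > c): asymptotic probability that selection at critical
    value c retains a regressor whose t-statistic has non-centrality psi."""
    return 1.0 - (norm_cdf(c - psi) - norm_cdf(-c - psi))
```

At the conventional 5% level (c of about 1.96) a regressor with psi = 1 is kept only about 17% of the time; looser levels raise both this probability and the null retention frequency, which is the trade-off the paper studies under location shifts.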
By:  Colella, Fabrizio (University of Lausanne); Lalive, Rafael (University of Lausanne); Sakalli, Seyhun Orcan (Université de Lausanne); Thoenig, Mathias (University of Lausanne) 
Abstract:  Analyses of spatial or network data are now very common. Yet statistical inference is challenging, since unobserved heterogeneity can be correlated across neighboring observational units. We develop an estimator for the variance-covariance matrix (VCV) of OLS and 2SLS that allows for arbitrary dependence of the errors across observations in space or network structure, and across time periods. As a proof of concept, we conduct Monte Carlo simulations in a geospatial setting based on US Metropolitan areas; tests based on our estimator of the VCV asymptotically correctly reject the null hypothesis, whereas conventional inference methods, e.g. those without clusters or with clusters based on administrative units, reject the null hypothesis too often. We also provide simulations in a network setting based on the IDEAS structure of co-authorship and real-life data on scientific performance; the Monte Carlo results again show that our estimator yields inference at the right significance level already in moderately sized samples, and that it dominates other commonly used approaches to inference in networks. We provide guidance to the applied researcher with respect to (i) including or not potentially correlated regressors and (ii) the choice of cluster bandwidth. Finally, we provide a companion statistical package (acreg) enabling users to adjust OLS and 2SLS coefficients' standard errors, accounting for arbitrary dependence. 
Keywords:  geospatial data, network data, arbitrary clustering, spatial correlation, instrumental variables 
JEL:  C13 C23 C26 
Date:  2019–08 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp12584&r=all 
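The flavor of such an estimator can be sketched with a Conley-style spatial HAC sandwich for OLS (a minimal sketch under assumed Bartlett-kernel weights in distance; it is not the acreg implementation, which also covers 2SLS, networks, and the time dimension):

```python
import numpy as np

def spatial_hac_vcv(X, resid, coords, bandwidth):
    """Sandwich VCV for OLS: the meat sums x_i u_i u_j x_j' across all pairs,
    weighted by a Bartlett kernel in distance, so pairs farther apart than
    the bandwidth get zero weight. O(n^2), for illustration only."""
    n, k = X.shape
    bread = np.linalg.inv(X.T @ X)
    meat = np.zeros((k, k))
    for i in range(n):
        for j in range(n):
            d = np.linalg.norm(coords[i] - coords[j])
            w = max(0.0, 1.0 - d / bandwidth)  # Bartlett kernel weight
            meat += w * resid[i] * resid[j] * np.outer(X[i], X[j])
    return bread @ meat @ bread
```

With a bandwidth of (near) zero only the i = j terms survive, and the estimator collapses to the heteroskedasticity-robust HC0 VCV.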
By:  Millimet, Daniel L. (Southern Methodist University); Parmeter, Christopher F. (University of Miami) 
Abstract:  While classical measurement error in the dependent variable in a linear regression framework results only in a loss of precision, non-classical measurement error can lead to estimates which are biased and inference which lacks power. Here, we consider a particular type of non-classical measurement error: skewed errors. Unfortunately, skewed measurement error is likely to be a relatively common feature of many outcomes of interest in political science research. This study highlights the bias that can result even from relatively "small" amounts of skewed measurement error, particularly if the measurement error is heteroskedastic. We also assess potential solutions to this problem, focusing on the stochastic frontier model and nonlinear least squares. Simulations and three replications highlight the importance of thinking carefully about skewed measurement error, as well as appropriate solutions. 
Keywords:  nonlinear least squares, stochastic frontier, measurement error 
JEL:  C18 C51 
Date:  2019–08 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp12576&r=all 
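A minimal simulation (illustrative assumptions, not the authors' design or data) shows the mechanism: a one-sided, skewed measurement error in the outcome whose scale grows with the regressor shifts E[y_obs | x] and biases the OLS slope, whereas a classical error would only add noise.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000
x = rng.uniform(0.5, 2.0, n)
y = 1.0 + 2.0 * x + rng.standard_normal(n)   # true model, slope = 2

# Skewed, heteroskedastic, one-sided error: scale grows with x,
# so E[err | x] = x and the conditional mean of y_obs is distorted.
err = x * rng.exponential(1.0, n)
y_obs = y - err                              # observed outcome is under-reported

slope_clean = np.polyfit(x, y, 1)[0]         # close to 2: unbiased
slope_noisy = np.polyfit(x, y_obs, 1)[0]     # close to 1: biased down by the slope of E[err | x]
```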
By:  Vanessa Berenguer Rico; Bent Nielsen; Søren Johansen 
Abstract:  The Least Trimmed Squares (LTS) and Least Median of Squares (LMS) estimators are popular robust regression estimators. The idea behind the estimators is to find, for a given h, a subsample of h 'good' observations among n observations and estimate the regression on that subsample. We find models, based on the normal and the uniform distribution respectively, in which these estimators are maximum likelihood. We provide an asymptotic theory for the location-scale case in those models. The LTS estimator is found to be h^(1/2)-consistent and asymptotically standard normal. The LMS estimator is found to be h-consistent and asymptotically Laplace. 
Keywords:  Chebychev estimator, LMS, Uniform distribution, Least squares estimator, LTS, Normal distribution, Regression, Robust statistics 
Date:  2019–09–04 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:879&r=all 
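For the location case the LTS idea is easy to sketch: among all h-subsamples, the one minimizing the within-subsample sum of squares is a contiguous block of the sorted data, so a single scan suffices. This is a sketch of the estimator's definition only, not of the paper's asymptotic theory.

```python
import numpy as np

def lts_location(y, h):
    """Least Trimmed Squares location estimate: the mean of the h-subsample
    with the smallest residual sum of squares. For location, the optimal
    subsample is a contiguous block of the sorted data, so a scan over
    the n - h + 1 windows finds it."""
    z = np.sort(np.asarray(y, dtype=float))
    n = len(z)
    best_ss, best_mean = np.inf, z[:h].mean()
    for i in range(n - h + 1):
        w = z[i:i + h]
        ss = ((w - w.mean()) ** 2).sum()
        if ss < best_ss:
            best_ss, best_mean = ss, w.mean()
    return best_mean
```

Trimming h = 4 of 5 observations, for instance, discards a single gross outlier entirely rather than letting it drag the mean.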
By:  Robert Calvert Jump (University of the West of England, Bristol) 
Abstract:  This paper demonstrates how sign restrictions can be used to infer the signs of certain historical shocks from reduced-form VAR residuals. This is achieved without recourse to non-sign information. The method is illustrated by an application to the AD-AS model using UK data. 
Keywords:  Structural VAR; sign restrictions. 
JEL:  C51 C52 
Date:  2018–01–02 
URL:  http://d.repec.org/n?u=RePEc:uwe:wpaper:20181802&r=all 
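The logic can be sketched for a bivariate AD-AS case (a stylized illustration under assumed sign restrictions, not the paper's full procedure): if a positive demand shock moves output and prices in the same direction while a positive supply shock moves them in opposite directions, then the signs of the reduced-form residuals alone pin down the sign of one structural shock each period.

```python
def shock_signs(u_y, u_p):
    """Infer historical shock signs from reduced-form AD-AS residuals
    (u_y: output residual, u_p: price residual) under the sign restrictions
    above. Same-sign residuals identify the demand shock's sign; opposite
    signs identify the supply shock's sign. Returns (demand, supply) as
    +1/-1, with None where the sign is not identified."""
    if u_y > 0 and u_p > 0:
        return (+1, None)   # both up: demand shock must be positive
    if u_y < 0 and u_p < 0:
        return (-1, None)   # both down: demand shock must be negative
    if u_y > 0 and u_p < 0:
        return (None, +1)   # output up, prices down: positive supply shock
    if u_y < 0 and u_p > 0:
        return (None, -1)   # output down, prices up: negative supply shock
    return (None, None)
```

The identification is by contradiction: with both residuals positive, a non-positive demand shock would require a positive supply shock large enough to raise output, which would then push prices down, contradicting the positive price residual.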
By:  Patrick Gagliardini (USI Università della Svizzera italiana; Swiss Finance Institute); Elisa Ossola (European Commission, Joint Research Centre); O. Scaillet (University of Geneva GSEM and GFRI; Swiss Finance Institute; University of Geneva  Research Center for Statistics) 
Abstract:  This chapter provides an econometric methodology for inference in large-dimensional conditional factor models in finance. Changes in the business cycle and asset characteristics induce time variation in factor loadings and risk premia that must be accounted for. The growing trend in the use of disaggregated data for individual securities motivates our focus on methodologies for a large number of assets. The beginning of the chapter outlines the concept of an approximate factor structure in the presence of conditional information, and develops an arbitrage pricing theory for large-dimensional factor models in this framework. We then distinguish between two different cases for inference depending on whether factors are observable or not. We focus on diagnosing model specification, estimating conditional risk premia, and testing asset pricing restrictions under increasing cross-sectional and time series dimensions. At the end of the chapter, we review some of the empirical findings and contrast analysis based on individual stocks with that based on standard sets of portfolios. We also discuss the impact on computing the time-varying cost of equity for a firm, and summarize differences between results for developed and emerging markets in an international setting. 
Keywords:  large panel, factor model, conditional information, risk premium, asset pricing, emerging markets 
JEL:  C12 C13 C23 C51 C52 G12 
Date:  2019–08 
URL:  http://d.repec.org/n?u=RePEc:chf:rpseri:rp1946&r=all 
By:  Junyue Wu; Yasumasa Matsuda 
Abstract:  This paper proposes a threshold extension of the Spatial Dynamic Panel Data (SDPD) model with fixed two-way effects to analyze data sets with spatial-temporal heterogeneity. We classify multiple regimes by a threshold variable to examine the regional dependency of parameters in SDPD models. A Bayesian estimation method and a maximum likelihood method are put forward and compared by their Monte Carlo performance. We find that our Bayesian method yields more preferable estimation results, though at the expense of computation time. We also illustrate empirical applications of the threshold SDPD model to two spatial panel data sets, US cigar demand data from 1963 to 1992 and Japan foreign labour data from 2008 to 2014, detecting meaningful regional dependencies in the SDPD model parameters. 
Date:  2019–07 
URL:  http://d.repec.org/n?u=RePEc:toh:dssraa:98&r=all 
By:  Matthieu Garcin (Research Center - Léonard de Vinci Pôle Universitaire - De Vinci Research Center) 
Abstract:  The multifractional process with random exponent (MPRE) is one of the most recent fractional models extending the fractional Brownian motion (fBm). This paper is an empirical contribution to the justification of the MPRE. Working with several FX rates between 2006 and 2016, sampled every minute, we show the statistical significance of various fractional models applied to log-prices, from the fBm to the MPRE. We propose a method to extract realized Hurst exponents from log-prices. This provides us with a series of Hurst exponents on which we can estimate different models of dynamics. In the MPRE framework, the data justify using a fractional model for the dynamics of the Hurst exponent. We estimate and interpret the value of the key parameter of this model of nested fractality, which is the Hurst exponent of the Hurst exponents. 
Keywords:  fractional Brownian motion, Hurst exponent, foreign exchange rate, multifractional Brownian motion, stable process, multifractional process with random exponent 
Date:  2019–09–11 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:hal02283915&r=all 
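The idea of extracting Hurst exponents from log-prices can be sketched with a generic moment-scaling estimator (an illustrative assumption; the paper's own extraction method may differ): for fBm, E[(X_{t+tau} - X_t)^2] scales as tau^(2H), so a log-log regression of mean squared increments on the lag has slope 2H.

```python
import numpy as np

def realized_hurst(log_prices, lags=(1, 2, 4, 8, 16)):
    """Moment-scaling Hurst estimate from a log-price path: regress
    log E[(X_{t+tau} - X_t)^2] on log tau and return half the slope."""
    x = np.asarray(log_prices, dtype=float)
    log_tau, log_m2 = [], []
    for tau in lags:
        inc = x[tau:] - x[:-tau]            # overlapping increments at lag tau
        log_tau.append(np.log(tau))
        log_m2.append(np.log(np.mean(inc ** 2)))
    slope = np.polyfit(log_tau, log_m2, 1)[0]
    return slope / 2.0
```

Applied on rolling windows of intraday log-prices, an estimator of this kind yields a series of realized Hurst exponents on which models for their dynamics can then be estimated, as the abstract describes.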
By:  Francesco Sergi (University of the West of England, Bristol) 
Abstract:  This contribution to the history of economic thought aims to describe how “Econometric Policy Evaluation: A Critique” (Lucas, 1976) has been interpreted through four decades of debates. This historical appraisal clarifies how Lucas’s argument is currently understood and discussed within the dynamic stochastic general equilibrium (DSGE) approach. The article illustrates how two opposite interpretations of the Lucas Critique arose in the early 1980s. On the one hand, a “theoretical interpretation” was championed by the real business cycle (RBC) approach; on the other hand, an “empirical interpretation” was advocated by Keynesians. Both interpretations can be understood as addressing a common question: do microfoundations imply parameter stability? Following the RBC theoretical interpretation, microfoundations do imply stability; conversely, for Keynesians, parameter stability (or instability) should be established by econometric evidence rather than theoretical considerations. Furthermore, the article argues that the DSGE approach represents a fragile compromise between these two opposite interpretations of Lucas (1976). This is especially true for the recent literature criticizing DSGE models for being vulnerable to the Lucas Critique. 
Date:  2018–01–06 
URL:  http://d.repec.org/n?u=RePEc:uwe:wpaper:20181806&r=all 
By:  Damjana Kokol Bukovšek; Tomaž Košir; Blaž Mojškerc; Matjaž Omladič 
Abstract:  An investigation is presented of how five of the most important measures of concordance (namely Spearman's rho, Kendall's tau, Spearman's footrule, Gini's gamma, and Blomqvist's beta) relate to non-exchangeability, i.e., asymmetry of copulas. Beyond these results, the proposed method also seems to be new, and may serve as a template for exploring the relationship between a specific property of a copula and some of its measures of dependence structure, or between various measures of dependence structure themselves. To simplify the view of this method and provide a more conceptual interpretation, a formulation borrowed from the imprecise probability setting is proposed, made applicable to a standard probability method by some recent discoveries. 
Date:  2019–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1909.06648&r=all 
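Two of the five measures of concordance named above have simple empirical versions, sketched here for sample data (minimal O(n^2) implementations for illustration; scipy.stats offers production versions):

```python
import numpy as np

def kendall_tau(x, y):
    """Empirical Kendall's tau: (concordant - discordant) pairs
    divided by the total number of pairs."""
    n = len(x)
    s = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            s += np.sign((x[i] - x[j]) * (y[i] - y[j]))
    return 2.0 * s / (n * (n - 1))

def spearman_rho(x, y):
    """Empirical Spearman's rho: Pearson correlation of the ranks
    (assumes no ties, for simplicity)."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]
```

Both measures are invariant to strictly increasing transformations of the margins, which is why they depend on the copula alone and can be related to copula properties such as asymmetry.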