New Economics Papers on Econometrics |
By: | Tom Boot (University of Groningen, The Netherlands); Didier Nibbering (Erasmus University Rotterdam, The Netherlands) |
Abstract: | In modern data sets, the number of available variables can greatly exceed the number of observations. In this paper we show how valid confidence intervals can be constructed by approximating the inverse covariance matrix by a scaled Moore-Penrose pseudoinverse, and using the lasso to perform a bias correction. In addition, we propose random least squares, a new regularization technique which yields narrower confidence intervals with the same theoretical validity. Random least squares estimates the inverse covariance matrix using multiple low-dimensional random projections of the data. This is shown to be equivalent to a generalized form of ridge regularization. The methods are illustrated in Monte Carlo experiments and an empirical example using quarterly data from the FRED-QD database, where gross domestic product is explained by a large number of macroeconomic and financial indicators. |
Keywords: | high-dimensional regression; confidence intervals; random projection; Moore-Penrose pseudoinverse |
JEL: | C12 C13 O40 |
Date: | 2017–03–14 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20170032&r=ecm |
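As a rough illustration of the random least squares idea described above, the sketch below averages low-dimensional random projections to approximate a singular inverse covariance matrix. All dimensions, the number of draws, and the projection scaling are illustrative assumptions rather than the authors' tuning choices, and the lasso bias-correction step is omitted.

```python
# Minimal sketch of random least squares: approximate the inverse of a
# rank-deficient (p > n) sample covariance by averaging inverses taken
# in random low-dimensional projections of the data.
import numpy as np

rng = np.random.default_rng(0)
n, p, d, n_draws = 50, 200, 20, 100        # n obs, p vars, projection dim
X = rng.standard_normal((n, p))
S = X.T @ X / n                            # p x p sample covariance (singular)

approx_inv = np.zeros((p, p))
for _ in range(n_draws):
    R = rng.standard_normal((p, d)) / np.sqrt(d)         # random projection
    approx_inv += R @ np.linalg.inv(R.T @ S @ R) @ R.T   # invert in d dims
approx_inv /= n_draws                      # average over projections
```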
By: | Juan Carlos Escanciano (Indiana University) |
Abstract: | This paper investigates estimation of linear regression models with strictly exogenous instruments under minimal identifying assumptions. The paper introduces a uniformly (in the data generating process) consistent estimator under nearly minimal identifying assumptions. The proposed estimator, called the Integrated Instrumental Variables (IIV) estimator, is a simple weighted least squares estimator and does not require the choice of a bandwidth or tuning parameter, or the selection of a finite set of instruments. Thus, the estimator is extremely simple to implement. Monte Carlo evidence supports the theoretical claims and suggests that the IIV estimator is a robust complement to optimal IV in finite samples. In an application with quarterly UK data, IIV estimates a positive and significant elasticity of intertemporal substitution and an equally sensible estimate for its reciprocal, in sharp contrast to IV methods that fail to identify these parameters. |
Keywords: | Uniform identification; Instrumental variables; Weak instruments; Uniform inference; Intertemporal elasticity of substitution |
Date: | 2016–11 |
URL: | http://d.repec.org/n?u=RePEc:inu:caeprp:2017001&r=ecm |
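The abstract stresses that IIV needs no bandwidth, tuning parameter, or instrument selection. The sketch below illustrates one bandwidth-free device in that spirit: transforming the instrument by its empirical CDF before simple IV. This is only a hedged stand-in for the integrated-instrument idea, not the paper's exact weighting scheme.

```python
# Simple IV with an empirical-CDF-transformed instrument: a tuning-free
# transformation in the spirit of "integrated" instruments. The exact
# IIV weights in the paper may differ from this illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 500
z = rng.standard_normal(n)                      # strictly exogenous instrument
u = rng.standard_normal(n)                      # structural error
x = 0.5 * z + 0.8 * u + rng.standard_normal(n)  # endogenous regressor
y = 1.0 * x + u                                 # true coefficient is 1

fz = np.argsort(np.argsort(z)) / n              # empirical CDF transform F_n(z)
beta = np.cov(fz, y)[0, 1] / np.cov(fz, x)[0, 1]   # just-identified IV ratio
print(beta)                                     # close to 1
```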
By: | Scott French (School of Economics, UNSW Business School, UNSW) |
Abstract: | I propose a method of moments estimator of revealed comparative advantage based on a flexible specification of trade flows that is consistent with a large class of gravity models of international trade. I show that this estimator has many desirable properties. It is theoretically consistent with the classical notion of Ricardian comparative advantage and is easily computed, even for very large samples. Statistical inference is straightforward, and it is closely related to a commonly-used estimator in the gravity literature that is known to be robust to various forms of heteroskedasticity and measurement error common to trade data. |
Keywords: | Method of moments; pseudo-maximum likelihood; Poisson; Ricardian; RCA; index |
JEL: | F10 F11 F14 C13 C21 C55 |
Date: | 2017–02 |
URL: | http://d.repec.org/n?u=RePEc:swe:wpaper:2017-05&r=ecm |
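Since the abstract relates the estimator to a robust workhorse of the gravity literature, a minimal Poisson pseudo-maximum-likelihood (PPML) example may help fix ideas. The data, the single gravity covariate, and all parameter values below are simulated assumptions.

```python
# Poisson pseudo-maximum-likelihood (PPML) on simulated bilateral trade
# flows: consistent under general heteroskedasticity as long as the
# conditional mean is correctly specified.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
log_dist = rng.uniform(0.0, 3.0, n)                 # log bilateral distance
flows = rng.poisson(np.exp(2.0 - 1.0 * log_dist))   # true elasticity is -1

X = sm.add_constant(log_dist)
ppml = sm.GLM(flows, X, family=sm.families.Poisson()).fit(cov_type="HC1")
print(ppml.params)                                  # approx [2.0, -1.0]
```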
By: | Shovan Chowdhury (Indian Institute of Management Kozhikode); Amitava Mukherjee (Indian Institute of Management Udaipur); Asok K. Nanda (IISER Kolkata) |
Abstract: | Here we introduce two-parameter compounded geometric distributions with monotone failure rates. These distributions are derived by compounding the geometric distribution with the zero-truncated Poisson distribution. Some statistical and reliability properties of the distributions are investigated. Parameters of the proposed distributions are estimated by the maximum likelihood method as well as by the minimum distance method of estimation. The performance of the estimates obtained by the two methods is compared in Monte Carlo simulations. An illustration with air crash casualties demonstrates that the distributions can serve as suitable models in several real situations. |
Keywords: | Compounding; Geometric Distribution; Hazard rate function; Maximum likelihood estimation; Method of minimum distance; Monte-Carlo; Zero-Truncated Poisson Distribution. |
URL: | http://d.repec.org/n?u=RePEc:iik:wpaper:144&r=ecm |
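The abstract does not spell out the compounding scheme; a standard construction takes the minimum or the maximum of N i.i.d. geometric lifetimes, with N zero-truncated Poisson, which produces the two monotone failure-rate cases. The simulation below is a sketch under that assumption.

```python
# Simulate minima and maxima of N geometric lifetimes, with N drawn from
# a zero-truncated Poisson; the exact compounding scheme in the paper is
# an assumption here.
import numpy as np

rng = np.random.default_rng(3)

def zt_poisson(lam):
    """Zero-truncated Poisson draw via rejection sampling."""
    k = 0
    while k == 0:
        k = rng.poisson(lam)
    return k

p, lam, n_draws = 0.2, 3.0, 10_000
x_min = np.empty(n_draws)
x_max = np.empty(n_draws)
for i in range(n_draws):
    lifetimes = rng.geometric(p, size=zt_poisson(lam))
    x_min[i], x_max[i] = lifetimes.min(), lifetimes.max()
print(x_min.mean(), x_max.mean())
```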
By: | Jinyuan Chang; Bin Guo; Qiwei Yao |
Abstract: | We consider a multivariate time series model which represents a high dimensional vector process as a sum of three terms: a linear regression on some observed regressors, a linear combination of some latent and serially correlated factors, and a vector white noise. We investigate the inference without imposing stationarity conditions on the target multivariate time series, the regressors and the underlying factors. Furthermore, we deal with the endogeneity that arises from correlations between the observed regressors and the unobserved factors. We also consider the model with a nonlinear regression term which can be approximated by a linear regression function with a large number of regressors. The convergence rates for the estimators of regression coefficients, the number of factors, the factor loading space and the factors are established in settings where the dimension of the time series and the number of regressors may both tend to infinity together with the sample size. The proposed method is illustrated with both simulated and real data examples. |
Keywords: | α-mixing; dimension reduction; instrumental variables; nonstationarity; time series |
JEL: | C13 C32 C39 |
Date: | 2015–12–19 |
URL: | http://d.repec.org/n?u=RePEc:ehl:lserod:61886&r=ecm |
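A hedged two-step sketch of this setting: regress out the observed regressors series by series, then estimate the factor loading space from eigenvectors of summed lagged autocovariance products (in the spirit of Lam and Yao's eigenanalysis). The paper's joint treatment of endogeneity and nonstationarity goes well beyond this.

```python
# Two-step factor estimation sketch: (1) OLS to remove observed
# regressors, (2) eigenanalysis of summed lagged autocovariances of the
# residuals to recover the factor loading space.
import numpy as np

rng = np.random.default_rng(4)
T, p, r = 400, 10, 2
f = np.cumsum(0.1 * rng.standard_normal((T, r)), axis=0)  # persistent factors
A = rng.standard_normal((p, r))                           # factor loadings
z = rng.standard_normal((T, 3))                           # observed regressors
D = rng.standard_normal((p, 3))
y = z @ D.T + f @ A.T + 0.5 * rng.standard_normal((T, p))

resid = y - z @ np.linalg.lstsq(z, y, rcond=None)[0]      # step 1: regress out z
M = np.zeros((p, p))
for k in range(1, 5):                                     # step 2: autocovariances
    S_k = resid[k:].T @ resid[:-k] / T
    M += S_k @ S_k.T
A_hat = np.linalg.eigh(M)[1][:, -r:]                      # estimated loading space
```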
By: | Sander Barendse (Erasmus University Rotterdam, The Netherlands) |
Abstract: | We propose a semiparametric estimator to determine the effects of explanatory variables on the conditional interquantile expectation (IQE) of the random variable of interest, without specifying the conditional distribution of the underlying random variables. IQE is the expected value of the random variable of interest given that its realization lies in an interval between two quantiles, or in an interval that covers the range of the distribution to the left or right of a quantile. Our so-called interquantile expectation regression (IQER) estimator is based on the GMM framework. We derive consistency and the asymptotic distribution of the estimator, and provide a consistent estimator of the asymptotic covariance matrix. Our results apply to stationary and ergodic time series. In a simulation study we show that our asymptotic theory provides an accurate approximation in small samples. We provide an empirical illustration in finance, in which we use the IQER estimator to estimate one-step-ahead daily expected shortfall conditional on previously observed daily, weekly, and monthly aggregated realized measures. |
Keywords: | quantile; interquantile expectation; regression; generalized method of moments; risk management; expected shortfall |
JEL: | C13 C14 C32 C58 G32 |
Date: | 2017–03–20 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20170034&r=ecm |
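A crude two-step version of the IQE idea, for intuition only: fit a conditional quantile, then run least squares on the observations beyond it. The paper instead estimates both ingredients jointly within a GMM framework, with valid standard errors; the data below are simulated.

```python
# Two-step tail-expectation regression: quantile regression for the
# alpha-quantile, then OLS on the left-tail observations. Illustrative
# only; the paper's IQER estimator handles both steps jointly via GMM.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(5)
n, alpha = 2000, 0.05
x = np.abs(rng.standard_normal(n))                 # e.g. a realized measure
y = -0.5 * x * np.abs(rng.standard_normal(n))      # losses scale with x

X = sm.add_constant(x)
q_fit = QuantReg(y, X).fit(q=alpha)                # step 1: conditional quantile
tail = y <= q_fit.predict(X)                       # observations beyond it
es_fit = sm.OLS(y[tail], X[tail]).fit()            # step 2: tail expectation
print(es_fit.params)
```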
By: | Shovan Chowdhury (Indian Institute of Management Kozhikode) |
Abstract: | A unified approach is proposed in this paper to study a family of lifetime distributions for a system consisting of a random number of components in series or in parallel. While the lifetimes of the components are assumed to follow the generalized (exponentiated) Weibull distribution, a zero-truncated Poisson distribution is assigned to model the random number of components in the system. The resulting family of compounded distributions includes several well-known distributions as well as some new models, and some of their statistical and reliability properties are derived. Various ageing classes of life distributions, including increasing, decreasing, bath-tub, upside-down-bathtub and roller-coaster shaped failure rates, are covered by the family of compounded distributions. A simple algorithm for maximum likelihood estimation of the model parameters is discussed. Some numerical results are obtained via Monte Carlo simulation. The asymptotic variance-covariance matrices of the estimators are also obtained. Five different real data sets are used to validate the distributions, and the results demonstrate that the family of distributions can serve as a suitable model in several real situations. |
Keywords: | Unified approach, Compounding, Generalized Weibull Distribution; Hazard Function; ML Estimation; Zero-Truncated Poisson Distribution. |
URL: | http://d.repec.org/n?u=RePEc:iik:wpaper:148&r=ecm |
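For concreteness, a simulation sketch of the system described above: a zero-truncated Poisson number of components with exponentiated (generalized) Weibull lifetimes, combined in series (minimum) or in parallel (maximum). All parameter values are illustrative assumptions.

```python
# Series and parallel system lifetimes with N ~ zero-truncated Poisson
# components and exponentiated Weibull component lifetimes (scipy's
# exponweib family).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
lam, a, c = 2.0, 1.5, 0.8           # Poisson rate; exponweib shape parameters

def zt_poisson(lam):
    k = 0
    while k == 0:
        k = rng.poisson(lam)
    return k

series, parallel = [], []
for _ in range(5000):
    comp = stats.exponweib.rvs(a, c, size=zt_poisson(lam), random_state=rng)
    series.append(comp.min())       # system fails at first component failure
    parallel.append(comp.max())     # system fails at last component failure
print(np.mean(series), np.mean(parallel))
```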
By: | Sujay K Mukhoti (Indian Institute of Management Kozhikode) |
Abstract: | In this paper I present a new single-factor model for asset returns observed in discrete time and their latent volatility, with a common “market factor”. This model attempts to unify the concepts of feedback effect and skewness in the return distribution. Further, it generalizes the existing stochastic volatility model with constant feedback to a framework with time-varying feedback. As an immediate consequence, dynamic skewness and a leverage effect follow. However, the dynamic structure violates the weak stationarity assumption usually made for heteroskedastic models of returns, and hence the concept of bounded stationarity is introduced to address the issue of nonstationarity. The single-factor model also helps to reduce the number of parameters to be estimated compared to existing SV models with separate feedback and skewness parameters. A characterization of the error distributions for returns and volatility is provided on the basis of the existence of conditional moments. Finally, an application of the model is presented with Normal errors and a half-Normal market factor distribution. |
URL: | http://d.repec.org/n?u=RePEc:iik:wpaper:145&r=ecm |
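A loose simulation of the abstract's core mechanism: one half-Normal “market factor” enters both the return and the next period's log-volatility, generating skewness and a feedback (leverage-like) channel. The placement of the factor and all parameter values are assumptions, not the paper's specification.

```python
# Single-factor SV sketch: a common half-Normal market factor w_t drives
# both the return and the log-volatility transition, creating skewness
# and feedback. Parameterization is illustrative.
import numpy as np

rng = np.random.default_rng(7)
T, mu, phi, tau, sigma = 1000, -1.0, 0.95, -0.3, 0.2
h = np.empty(T); h[0] = mu                       # log-volatility
r = np.empty(T)                                  # returns
for t in range(T):
    w = abs(rng.standard_normal())               # half-Normal market factor
    r[t] = np.exp(h[t] / 2) * (tau * w + rng.standard_normal())
    if t + 1 < T:                                # feedback: w also moves h
        h[t + 1] = mu + phi * (h[t] - mu) + 0.5 * w + sigma * rng.standard_normal()
print(r.std(), np.corrcoef(r[:-1], np.abs(r[1:]))[0, 1])
```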
By: | Matthew T. Holt (University of Alabama, Department of Economics, Finance & Legal Studies); Timo Teräsvirta (Aarhus University and CREATES, C.A.S.E., Humboldt-Universität zu Berlin) |
Abstract: | This paper examines local changes in annual temperature data for the northern and southern hemispheres (1850-2014) by using a multivariate generalisation of the shifting-mean autoregressive model of González and Teräsvirta (2008). Univariate models are first fitted to each series by using the QuickShift methodology. Full information maximum likelihood estimates of a bivariate system of temperature equations are then obtained, and asymptotic properties of the corresponding estimators are considered. The system is then used to perform formal tests of co-movements, called co-shifting, in the series. The results show evidence of co-shifting in the two series. Forecasting this pair of series is considered as well. |
Keywords: | Co-breaking, Hemispheric temperatures, Vector nonlinear model, Testing linearity, Structural change |
JEL: | C22 C32 C52 C53 Q54 |
Date: | 2401 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2017-05&r=ecm |
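A small simulation of the univariate building block, the shifting-mean autoregression of González and Teräsvirta (2008): the intercept moves through smooth logistic transitions in rescaled time. The transition locations, slopes, and AR coefficient below are illustrative.

```python
# Shifting-mean AR(1): the mean delta(t/T) is a sum of logistic
# transitions in rescaled time, around which the series is a stationary
# autoregression.
import numpy as np

rng = np.random.default_rng(8)
T = 300
s = np.arange(1, T + 1) / T                      # rescaled time t/T

def logistic(s, gamma, c):
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

delta = 1.5 * logistic(s, 30.0, 0.4) - 1.0 * logistic(s, 50.0, 0.75)
y = np.empty(T); y[0] = delta[0]
for t in range(1, T):
    y[t] = delta[t] + 0.5 * (y[t - 1] - delta[t - 1]) + 0.3 * rng.standard_normal()
```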
By: | Soumya Roy (Indian Institute of Management Kozhikode); Gijo E. V. (Indian Statistical Institute, Bangalore); Biswabrata Pradhan (Indian Statistical Institute, Kolkata) |
Abstract: | This article considers inference for the unknown parameters of the log-normal distribution based on progressive Type-I interval censored data, by both frequentist and Bayesian methods. The maximum likelihood estimates (MLEs) are computed using the EM algorithm. The asymptotic standard errors (ASEs) of the MLEs are obtained. Various Bayes estimates of the unknown parameters are also computed. It is observed that the Bayes estimates cannot be obtained in explicit form. A Gibbs sampling scheme is developed by adopting a data augmentation method to compute the Bayes estimates and highest posterior density credible intervals. The performance of the MLEs and the Bayesian estimators is judged by a simulation study. A real data set is analyzed for the purpose of illustration. |
Keywords: | Bayesian D- and C-optimality criteria, Data Augmentation, EM algorithm, Missing Information Principle, Gibbs Sampling, Optimal design. |
Date: | 2016–01 |
URL: | http://d.repec.org/n?u=RePEc:iik:wpaper:191&r=ecm |
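As a simplified stand-in for the EM approach, the sketch below computes the log-normal MLEs from interval-censored data by directly maximizing the interval likelihood, ignoring the progressive-withdrawal feature of the paper's scheme; the inspection times and parameter values are assumptions.

```python
# Direct MLE for log-normal parameters from interval-censored data: each
# lifetime is only known to fall in (lo, hi]. The paper instead uses an
# EM algorithm; both approaches maximize the same interval likelihood.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(9)
x = stats.lognorm.rvs(s=0.5, scale=np.exp(1.0), size=300, random_state=rng)
edges = np.array([0.0, 1.0, 2.0, 3.0, 5.0, np.inf])    # inspection times
idx = np.searchsorted(edges, x) - 1
lo, hi = edges[idx], edges[idx + 1]

def neg_loglik(theta):
    mu, sd = theta[0], np.exp(theta[1])
    upper = np.where(np.isinf(hi), 1.0,
                     stats.norm.cdf((np.log(hi + 1e-300) - mu) / sd))
    lower = np.where(lo == 0.0, 0.0,
                     stats.norm.cdf((np.log(lo + 1e-300) - mu) / sd))
    return -np.sum(np.log(upper - lower + 1e-12))

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x[0], np.exp(res.x[1]))                      # approx (1.0, 0.5)
```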
By: | Huber, Florian |
Abstract: | In this note we develop a Taylor-rule-based empirical exchange rate model for eleven major currencies that endogenously determines the number of structural breaks in the coefficients. Using a constant parameter specification and a standard time-varying parameter model as competitors reveals that our flexible modeling framework yields more precise density forecasts for all major currencies under scrutiny over the last 24 years. |
Keywords: | Stochastic volatility, mixture innovation models, time-varying parameters |
Date: | 2017–03 |
URL: | http://d.repec.org/n?u=RePEc:wiw:wus005:5461&r=ecm |
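The Kalman-filter backbone of a time-varying parameter regression with random-walk coefficients is sketched below; mixture innovation models such as the paper's additionally introduce a latent indicator that switches the state innovation variance on and off, so that coefficient breaks occur only at some dates. All values are simulated.

```python
# Scalar Kalman filter for a regression y_t = beta_t * x_t + eps_t with
# a random-walk coefficient beta_t. Mixture innovation models replace
# the constant state variance q with s_t * q for a latent 0/1 indicator.
import numpy as np

rng = np.random.default_rng(10)
T = 500
x = rng.standard_normal(T)
beta_true = np.cumsum(0.05 * rng.standard_normal(T))    # drifting coefficient
y = beta_true * x + 0.5 * rng.standard_normal(T)

b, P, q, r = 0.0, 1.0, 0.05**2, 0.5**2                  # state mean/var, noise vars
b_filt = np.empty(T)
for t in range(T):
    P += q                                              # predict (random walk)
    S = x[t] ** 2 * P + r                               # one-step forecast variance
    K = P * x[t] / S                                    # Kalman gain
    b += K * (y[t] - x[t] * b)                          # measurement update
    P -= K * x[t] * P
    b_filt[t] = b
```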
By: | Søren Johansen (Department of Economics, University of Copenhagen); Morten Nyboe Tabor (Department of Economics, University of Copenhagen) |
Abstract: | In a linear state space model Y(t) = B T(t) + e(t), we investigate whether the unobserved trend, T(t), cointegrates with the predicted trend, E(t), and with the estimated predicted trend, in the sense that the spreads are stationary. We find that this result holds for the spread B(T(t)-E(t)) and the estimated spread. For the spread between the trend and the estimated trend, T(t)-E(t), however, cointegration depends on the identification of B. The same results are found if the observations Y(t) from the state space model are analysed using a cointegrated vector autoregressive model, where the trend is defined as the common trend. Finally, we investigate cointegration between the spreads between the trends and their estimators based on the two models, and find the same results. We illustrate with two examples and confirm the results by a small simulation study. |
Keywords: | Cointegration of trends, State space models, CVAR models |
JEL: | C32 |
Date: | 2017–03–13 |
URL: | http://d.repec.org/n?u=RePEc:kud:kuiedp:1702&r=ecm |
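A small simulation in the spirit of the abstract: a bivariate observation driven by one random-walk trend, a scalar Kalman filter for the trend, and the resulting spread between the trend and its filtered estimate. The true B is known here, so this only illustrates the objects involved, not the identification issue itself.

```python
# Simulate Y(t) = B*T(t) + e(t), filter the scalar random-walk trend,
# and form the spread T(t) - E(t) whose stationarity the paper studies.
import numpy as np

rng = np.random.default_rng(11)
T_len = 500
B = np.array([1.0, 0.5])
trend = np.cumsum(rng.standard_normal(T_len))            # random-walk trend T(t)
Y = np.outer(trend, B) + rng.standard_normal((T_len, 2))

t_hat, P = 0.0, 1.0                                      # filtered state and var
q, R = 1.0, np.eye(2)                                    # state and obs noise
est = np.empty(T_len)
for t in range(T_len):
    P += q                                               # predict
    S = np.outer(B, B) * P + R                           # innovation covariance
    K = P * B @ np.linalg.inv(S)                         # Kalman gain (row vector)
    t_hat += K @ (Y[t] - B * t_hat)                      # update
    P -= K @ B * P
    est[t] = t_hat
spread = trend - est                                     # T(t) - E(t)
```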
By: | Jimut Bahan Chakrabarty (Indian Institute of Management Kozhikode); Shovan Chowdhury (Indian Institute of Management Kozhikode) |
Abstract: | In this paper two probability distributions are introduced by compounding the inverse Weibull distribution with the Poisson and geometric distributions. The distributions can be used to model the lifetime of a series system where the component lifetimes follow an inverse Weibull distribution and the number of components, being random, follows either a geometric or a Poisson distribution. Some of the important statistical and reliability properties of each of the distributions are derived. The distributions are found to exhibit both monotone and non-monotone failure rates. The parameters of the distributions are estimated using the maximum likelihood method and the expectation-maximization algorithm. The potential of the distributions is explored through three real-life data sets, with comparisons to similar compounded distributions, viz. the Weibull-geometric, Weibull-Poisson, exponential-geometric and exponential-Poisson distributions. |
Keywords: | Inverse Weibull distribution, Poisson distribution, Geometric distribution, Hazard function, Maximum likelihood estimation, EM algorithm. |
Date: | 2016–12 |
URL: | http://d.repec.org/n?u=RePEc:iik:wpaper:213&r=ecm |
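Under one assumed construction consistent with the abstract (a series system with a geometric number of inverse Weibull components, N on {1, 2, ...} with P(N = n) = (1-p) p^(n-1)), the system lifetime has density f_X = (1-p) f / (1 - p S)^2, where f and S are the component density and survival function. A hedged MLE sketch:

```python
# MLE for an assumed inverse-Weibull-geometric series-system lifetime:
# X = min of N components, N geometric, giving density
# f_X(x) = (1 - p) * f(x) / (1 - p * S(x))**2.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(12)
c_true, p_true = 2.0, 0.6
n_comp = rng.geometric(1 - p_true, size=1000)        # components per system
x = np.array([stats.invweibull.rvs(c_true, size=k, random_state=rng).min()
              for k in n_comp])

def neg_loglik(theta):
    c = np.exp(theta[0])                             # shape > 0
    p = 1.0 / (1.0 + np.exp(-theta[1]))              # p in (0, 1)
    f = stats.invweibull.pdf(x, c)
    S = stats.invweibull.sf(x, c)
    return -np.sum(np.log(1 - p) + np.log(f) - 2 * np.log(1 - p * S))

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
print(np.exp(res.x[0]), 1 / (1 + np.exp(-res.x[1])))   # approx (2.0, 0.6)
```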
By: | Ellis Scharfenaker (Department of Economics, University of Missouri Kansas City); Duncan Foley (Department of Economics, New School for Social Research) |
Abstract: | Many problems in empirical economic analysis involve systems in which the quantal actions of a large number of participants determine the distribution of some social outcome. In many of these cases key model variables are unobserved. From the statistical perspective, when observed variables depend non-trivially on unobserved variables, the joint distribution of the variables of interest is underdetermined and the model is ill-posed due to incomplete information. In this paper we examine the class of models defined by a joint distribution of discrete individual actions and an outcome variable, where one of the variables is unobserved, so that the joint distribution is underdetermined. We derive a general maximum entropy based method to infer the underdetermined joint distribution in this class of models. We apply this method to the classical Smithian theory of competition, where firms' profit rates are observed but the entry and exit decisions that determine the distribution of profit rates are unobserved. |
Keywords: | Quantal response, maximum entropy, Information-theoretic quantitative methods, incomplete information, link function, profit rate distribution |
JEL: | C10 C18 C70 C79 |
Date: | 2017–03 |
URL: | http://d.repec.org/n?u=RePEc:new:wpaper:1710&r=ecm |
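A compact illustration of the maximum entropy logic: with an observed marginal over profit-rate bins and a mean-payoff constraint on the unobserved binary action, the entropy-maximizing conditional is a logit (quantal response) in the payoff. The grid, marginal shape, and temperature below are illustrative assumptions.

```python
# Maximum entropy joint of (unobserved entry decision, observed profit
# rate): the payoff-constrained maxent conditional is a logit in profit,
# tilted by an inverse "temperature" beta.
import numpy as np

profit = np.linspace(-0.1, 0.3, 41)                  # profit-rate grid
marginal = np.exp(-np.abs(profit - 0.1) / 0.05)      # observed marginal (Laplace-like)
marginal /= marginal.sum()

beta = 25.0                                          # behavioral inverse temperature
p_enter = 1.0 / (1.0 + np.exp(-beta * profit))       # maxent (logit) conditional
joint = np.stack([p_enter * marginal,                # P(enter, profit)
                  (1 - p_enter) * marginal])         # P(exit, profit)
print(joint.sum())                                   # a proper joint: sums to 1
```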
By: | Thomas Leirvik (Nord University); Peter C.B. Phillips (Cowles Foundation, Yale University); Trude Storelvmo (Yale University) |
Abstract: | How sensitive is Earth’s climate to a given increase in atmospheric greenhouse gas (GHG) concentrations? This long-standing and fundamental question in climate science was recently analyzed by dynamic panel data methods using extensive spatio-temporal data on global surface temperatures, solar radiation, and GHG concentrations over the last half century to 2010 (Storelvmo et al., 2016). These methods revealed that atmospheric aerosol effects masked approximately one-third of the continental warming due to increasing GHG concentrations over this period, thereby implying greater climate sensitivity to GHGs than previously thought. The present study provides asymptotic theory justifying the use of these methods when there are stochastic process trends in both the global forcing variables, such as GHGs, and station-level trend effects from such sources as local aerosol pollutants. These asymptotics validate confidence interval construction for econometric measures of Earth’s transient climate sensitivity. The methods are applied to observational data and to data generated from three leading global climate models (GCMs) that are sampled spatio-temporally in the same way as the empirical observations. The findings indicate that estimates of transient climate sensitivity produced by these GCMs lie within empirically determined confidence limits, but that the GCMs uniformly underestimate the effects of aerosol-induced dimming. The analysis shows the potential of econometric methods to calibrate GCM performance against observational data and to reveal the respective sensitivity parameters (GHG and non-GHG related) governing GCM temperature trends. |
Keywords: | Climate sensitivity, Cointegration, Common stochastic trend, Idiosyncratic trend, Spatio-temporal model, Unit root |
JEL: | C32 C33 |
Date: | 2017–02 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:2083&r=ecm |
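A stripped-down within-estimator sketch of the kind of panel regression underlying the climate sensitivity estimates: station-level temperature on a common GHG forcing and local radiation with station fixed effects. The simulated data and coefficients are assumptions; the paper's dynamic terms and stochastic trends are omitted.

```python
# Fixed-effects (within) panel regression of temperature on GHG forcing
# and surface radiation across stations; demeaning removes the station
# effects before pooled OLS.
import numpy as np

rng = np.random.default_rng(13)
n_st, n_yr = 100, 50
ghg = np.tile(np.linspace(0.0, 1.0, n_yr), (n_st, 1))    # common GHG forcing
rad = rng.standard_normal((n_st, n_yr))                  # station-level radiation
alpha = rng.standard_normal((n_st, 1))                   # station fixed effects
temp = alpha + 2.0 * ghg + 0.1 * rad + 0.3 * rng.standard_normal((n_st, n_yr))

demean = lambda a: a - a.mean(axis=1, keepdims=True)     # within transformation
X = np.column_stack([demean(ghg).ravel(), demean(rad).ravel()])
y = demean(temp).ravel()
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta)                                              # approx [2.0, 0.1]
```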
By: | Peter C.B. Phillips (Cowles Foundation, Yale University) |
Abstract: | Professor T.W. Anderson passed away on September 17, 2016 at the age of 98 years after an astonishing career that spanned more than seven decades. Standing at the nexus of the statistics and economics professions, Ted Anderson made enormous contributions to both disciplines, playing a significant role in the birth of modern econometrics with his work on structural estimation and testing in the Cowles Commission during the 1940s, and educating successive generations through his brilliant textbook expositions of time series and multivariate analysis. This article is a tribute to his many accomplishments. |
Keywords: | T. W. Anderson, Cowles Commission, Limited information maximum likelihood, Multivariate analysis, Time series |
JEL: | A14 B23 |
Date: | 2016–12 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:2081&r=ecm |
By: | Timo Teräsvirta (Aarhus University and CREATES, C.A.S.E., Humboldt-Universität zu Berlin) |
Abstract: | Clive Granger had a wide range of research interests and worked in a number of areas. In this work the focus is on his contributions to nonlinear time series models and modelling. Granger's contributions to a few other aspects of nonlinearity are reviewed as well. |
JEL: | C22 C51 C52 C53 |
Keywords: | cointegration, nonlinearity, nonstationarity, testing linearity |
Date: | 2701 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2017-04&r=ecm |
By: | David F. Hendry (University of Oxford); Peter C.B. Phillips (Cowles Foundation, Yale University) |
Abstract: | During his period at the LSE from the early 1960s to the mid 1980s, John Denis Sargan rose to international prominence and the LSE emerged as the world’s leading centre for econometrics. Within this context, we examine the life of Denis Sargan, describe his major research accomplishments, recount the work of his many doctoral students, and track this remarkable period that constitutes the Sargan era of econometrics at the LSE. |
Keywords: | John Denis Sargan, London School of Economics, Econometrics, Asymptotic theory, Small-sample distributions, Dynamic models, Autocorrelated errors, Empirical modelling, Doctoral training |
JEL: | A14 B23 |
Date: | 2017–03 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:2082&r=ecm |