
on Econometrics 
By:  Aknouche, Abdelhakim 
Abstract:  A unified quasi-maximum likelihood (QML) estimation theory for stationary and nonstationary simple Markov bilinear (SMBL) models is proposed. Such models may be seen as generalized random coefficient autoregressions (GRCA) in which the innovation and the random coefficient processes are fully correlated. It is shown that the QML estimator (QMLE) for the SMBL model is always asymptotically Gaussian without assuming strict stationarity, meaning that there is no knife-edge effect. The asymptotic variance of the QMLE is different in the stationary and nonstationary cases but is consistently estimated using the same estimator. A perhaps surprising result is that in the nonstationary domain, all SMBL parameters are consistently estimated, in contrast with unstable GARCH and GRCA models, where the QMLE of the conditional variance intercept is inconsistent. As a result, strict stationarity testing for the SMBL model is studied. Simulation experiments and a real application to strict stationarity testing for some financial stock returns illustrate the theory in finite samples. 
Keywords:  Markov bilinear process, random coefficient process, stability, instability, quasi-maximum likelihood, knife-edge effect, strict stationarity testing. 
JEL:  C10 C13 C18 C19 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:69572&r=ecm 
By:  Olivier Ledoit; Michael Wolf 
Abstract:  This paper deals with certain estimation problems involving the covariance matrix in large dimensions. Due to the breakdown of finite-dimensional asymptotic theory when the dimension is not negligible with respect to the sample size, it is necessary to resort to an alternative framework known as large-dimensional asymptotics. Recently, Ledoit and Wolf (2015) have proposed an estimator of the eigenvalues of the population covariance matrix that is consistent according to a mean-square criterion under large-dimensional asymptotics. It requires numerical inversion of a multivariate nonrandom function which they call the QuEST function. The present paper explains how to numerically implement the QuEST function in practice through a series of six successive steps. It also provides an algorithm to compute the Jacobian analytically, which is necessary for numerical inversion by a nonlinear optimizer. Monte Carlo simulations document the effectiveness of the code. 
Keywords:  Large-dimensional asymptotics, numerical optimization, random matrix theory, spectrum estimation 
JEL:  C13 C61 C87 
Date:  2016–01 
URL:  http://d.repec.org/n?u=RePEc:zur:econwp:215&r=ecm 
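The breakdown of finite-dimensional asymptotics that motivates the QuEST machinery is easy to reproduce. The sketch below is an illustration under assumed toy settings, not the paper's estimator: data are drawn from a population with identity covariance, so every population eigenvalue equals 1, yet the sample eigenvalues spread out over the Marchenko-Pastur support when p/n is not negligible.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 200                    # concentration ratio c = p/n = 0.5
X = rng.standard_normal((n, p))    # population covariance = identity
S = X.T @ X / n                    # sample covariance matrix
lam = np.linalg.eigvalsh(S)        # sample eigenvalues, ascending

c = p / n
# Marchenko-Pastur support for an identity population covariance
lo, hi = (1 - np.sqrt(c))**2, (1 + np.sqrt(c))**2
print(lam.min(), lam.max(), (lo, hi))
```

Although every population eigenvalue is 1, the sample eigenvalues range over roughly [0.09, 2.91]; recovering the population spectrum from this distorted sample spectrum is the inversion problem the QuEST function addresses.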
By:  Egger, Peter; Nigai, Sergey 
Abstract:  The measurement of trade costs and their effects on outcomes is at the heart of a large quantitative literature in international economics. The majority of recent significant contributions on the matter assumes that trade consists of a product of exporter-time-specific factors, importer-time-specific factors, and country-pair-time-specific trade costs, and that log trade costs are additively composed of a parameterized part and a residual part. We demonstrate that residual trade costs are relatively important and that the parameters on observable trade-cost measures, as well as the structural country-time-specific variables or parameters, are inevitably biased regardless of whether models are estimated by ordinary least squares or by exponential-family models. The reason is that the country-specific variables are endogenous to the residual trade costs, regardless of whether they are captured by iteratively solved structural terms or by country-time fixed effects. As a result, quantifications of the effects of trade costs and comparative-static results are also biased. Apart from diagnosing this problem, the paper provides remedies for it. All of the proposed remedies involve binary indicator variables (fixed effects) only, and they are nonlinear in both variables and parameters. We therefore dub these approaches constrained analysis of variance (CANOVA) of bilateral exports or imports. We propose saturated as well as unsaturated versions of the CANOVA approach for both cross-section and panel data. The saturated approach uses up all degrees of freedom and estimates as many parameters as there are observations on bilateral exports or imports, providing an exact decomposition of bilateral trade into trade costs and country-specific parameters. 
The unsaturated approaches do not use up all degrees of freedom and estimate fewer parameters than there are observations, providing an approximate decomposition of bilateral trade into trade costs and country-specific parameters. We demonstrate that with panel data an unsaturated model with exporter-time, importer-time and country-pair effects works quite well relative to both the saturated model and models with parameterized trade-cost functions. The conclusions of the CANOVA models regarding the importance of trade costs for trade turn out to be substantially different from those implied by the conventional parameterized trade-cost-function models. 
Keywords:  fixed effects estimation; gravity models; panel econometrics; structural general equilibrium models 
JEL:  C23 F14 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:10427&r=ecm 
By:  Sokbae (Simon) Lee (Institute for Fiscal Studies); Myung Hwan Seo (Institute for Fiscal Studies); Youngki Shin (Institute for Fiscal Studies) 
Abstract:  We consider a high-dimensional regression model with a possible change point due to a covariate threshold and develop the Lasso estimator of the regression coefficients as well as the threshold parameter. Our Lasso estimator not only selects covariates but also selects between the linear and threshold regression models. Under a sparsity assumption, we derive nonasymptotic oracle inequalities for both the prediction risk and the l1 estimation loss for the regression coefficients. Since the Lasso estimator selects variables simultaneously, we show that oracle inequalities can be established without pretesting the existence of the threshold effect. Furthermore, we establish conditions under which the estimation error of the unknown threshold parameter can be bounded by a nearly n⁻¹ factor even when the number of regressors can be much larger than the sample size (n). We illustrate the usefulness of our proposed estimation method via Monte Carlo simulations and an application to real data. 
Date:  2014–05 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:26/14&r=ecm 
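The estimation strategy described above, a Lasso fit combined with a search over the covariate threshold, can be sketched on simulated data. This is a minimal illustration, not the authors' implementation: the coordinate-descent Lasso is hand-rolled, and the penalty level, threshold grid and data-generating process are all arbitrary assumptions.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Minimal coordinate-descent Lasso: min 0.5/n ||y - Xb||^2 + lam ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X**2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]          # partial residual
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

def threshold_lasso(X, y, q, taus, lam):
    """Grid-search the threshold tau; refit the Lasso on [X, X*1(q > tau)]."""
    best = None
    for tau in taus:
        Z = np.hstack([X, X * (q > tau)[:, None]])
        b = lasso_cd(Z, y, lam)
        obj = 0.5 * np.mean((y - Z @ b)**2) + lam * np.abs(b).sum()
        if best is None or obj < best[0]:
            best = (obj, tau, b)
    return best[1], best[2]

rng = np.random.default_rng(1)
n, p = 300, 5
X = rng.standard_normal((n, p))
q = rng.uniform(size=n)
# true model: the first coefficient jumps from 1 to 3 when q > 0.5
y = X[:, 0] + 2.0 * X[:, 0] * (q > 0.5) + 0.1 * rng.standard_normal(n)
tau_hat, b_hat = threshold_lasso(X, y, q, np.linspace(0.2, 0.8, 13), lam=0.05)
print(tau_hat)
```

With a strong regime shift and little noise, the objective is minimized at the true threshold 0.5, and the Lasso simultaneously zeroes out the irrelevant coefficients in both regimes.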
By:  Luisa Corrado (DEF and CEIS, Università di Roma "Tor Vergata" and University of Cambridge); Bernard Fingleton (University of Cambridge, Department of Land Economy) 
Abstract:  Network and spatial econometric models commonly embody a so-called W matrix which defines the connectivity between nodes of a network. The reason for the existence of W is that it facilitates parsimonious parametrization of internodal interaction which would otherwise be very difficult to achieve from a practical modelling perspective. The problem considered in this paper is the effect of misspecifying W. The paper demonstrates the effect in the context of two types of model, the dynamic spatial autoregressive panel model and the multilevel spatial autoregressive panel model, both of which include W as part of the model specification and use W in estimation. Monte Carlo results are presented showing the impact of misspecification of W on bias and RMSE. The paper highlights the need for careful attention to the correct structure of W in spatial econometric and network modelling. 
Keywords:  Networks, Multilevel Modelling, Fixed Effects, Dynamic Spatial Autoregressive Panel Model, Multilevel Spatial Autoregressive Panel Model 
Date:  2016–02–12 
URL:  http://d.repec.org/n?u=RePEc:rtv:ceisrp:369&r=ecm 
By:  Quiroz, Matias (Research Department, Central Bank of Sweden); Villani, Mattias (Linköping University); Kohn, Robert (University of New South Wales) 
Abstract:  We propose a generic Markov Chain Monte Carlo (MCMC) algorithm to speed up computations for datasets with many observations. A key feature of our approach is the use of the highly efficient difference estimator from the survey sampling literature to estimate the log-likelihood accurately using only a small fraction of the data. Our algorithm improves on the O(n) complexity of regular MCMC by operating over local data clusters instead of the full sample when computing the likelihood. The likelihood estimate is used in a pseudo-marginal framework to sample from a perturbed posterior which is within O(m^(-1/2)) of the true posterior, where m is the subsample size. The method is applied to a logistic regression model to predict firm bankruptcy for a large dataset. We document a significant speed-up in comparison to standard MCMC on the full dataset. 
Keywords:  Bayesian inference; Markov Chain Monte Carlo; Pseudo-marginal MCMC; estimated likelihood; GLM for large data. 
JEL:  C11 C13 C15 C83 
Date:  2015–08–01 
URL:  http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0306&r=ecm 
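The variance gain from the survey-sampling difference estimator mentioned in the abstract can be illustrated on a toy sum. In the sketch below the "log-likelihood contributions" and their cheap proxies are simulated placeholders, not a real model; the point is only that correcting the proxy sum with a small subsample is far less variable than scaling up the subsample alone.

```python
import numpy as np

rng = np.random.default_rng(2)
N, m = 100_000, 500
# placeholder per-observation log-likelihood contributions
ell = -0.5 * rng.standard_normal(N)**2
# cheap proxies, e.g. from a Taylor expansion around a reference parameter;
# here simply a noisy copy of ell
proxy = ell + 0.05 * rng.standard_normal(N)
true_sum = ell.sum()

def plain_estimate():
    """Naive subsample estimator: scale up the subsample sum."""
    idx = rng.choice(N, m, replace=False)
    return N / m * ell[idx].sum()

def difference_estimate():
    """Difference estimator: full proxy sum plus a subsample correction."""
    idx = rng.choice(N, m, replace=False)
    return proxy.sum() + N / m * (ell[idx] - proxy[idx]).sum()

plain = np.array([plain_estimate() for _ in range(200)])
diff = np.array([difference_estimate() for _ in range(200)])
print(plain.std(), diff.std())    # the difference estimator is far less variable
```

Both estimators are unbiased for the full-data sum, but the difference estimator's variance is driven by the residuals ell - proxy, so the better the proxies, the smaller the noise injected into the pseudo-marginal chain.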
By:  van Oest, R.D.; Franses, Ph.H.B.F. 
Abstract:  Crucial inference for the hierarchical linear model concerns the null hypothesis of no random slope. We argue that the usually applied statistical test suffers from the so-called Davies problem, that is, a nuisance parameter is identified only under the alternative. We propose an easy-to-implement methodology that exploits this property. We provide the relevant critical values and demonstrate through simulations that our new methodology has better power properties. 
Keywords:  hierarchical linear model, random effects, slope variance, Davies problem 
Date:  2015–01–05 
URL:  http://d.repec.org/n?u=RePEc:ems:eureir:78063&r=ecm 
By:  Tue Gørgens (Australian National University); Dean Hyslop (Motu Economic and Public Policy Research) 
Abstract:  This paper examines dynamic binary response and multi-spell duration model approaches to analyzing longitudinal discrete-time binary outcomes. Prototypical dynamic binary response models specify low-order Markovian state dependence and restrict the effects of observed and unobserved heterogeneity on the probability of transitioning into and out of a state to have the same magnitude and opposite signs. In contrast, multi-spell duration models typically allow for state-specific duration dependence, and allow the probability of entry into and exit from a state to vary flexibly. We show that both of these approaches are special cases within a general framework. We compare specific dynamic binary response and multi-spell duration models empirically using a case study of poverty transitions. In this example, both the specification of state dependence and the restrictions on the state-specific transition probabilities imposed by the simpler dynamic binary response models are severely rejected against the more flexible multi-spell duration models. Consistent with recent literature, we conclude that the standard dynamic binary response model is unacceptably restrictive in this context. 
Keywords:  Panel data, transition data, binary response, duration analysis, event history analysis, initial conditions, random effects. 
JEL:  C33 C35 C41 C51 
Date:  2016–02 
URL:  http://d.repec.org/n?u=RePEc:mtu:wpaper:16_01&r=ecm 
By:  Makieła, Kamil 
Abstract:  The paper investigates a Bayesian approach to estimating the generalized true random-effects model (GTRE) via Gibbs sampling. Simulation results show that, under properly defined priors for the transient and persistent inefficiency components, the posterior characteristics of the GTRE model are well approximated using a simple Gibbs sampling procedure. No model reparametrization is required, and if one is made it leads to much lower numerical efficiency. The new model allows us to make more reasonable assumptions about the prior inefficiency distribution and appears more reliable, especially in handling nuisance datasets. An empirical application furthers the research into stochastic frontier analysis using the GTRE by examining the relationship between the inefficiency terms in the GTRE, true random-effects (TRE), generalized stochastic frontier and standard stochastic frontier models. 
Keywords:  generalized true random-effects model, stochastic frontier analysis, Bayesian inference, cost efficiency, firm heterogeneity, transient and persistent efficiency 
JEL:  C11 C23 C51 D24 
Date:  2016–01–19 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:69389&r=ecm 
By:  Wang, Xuexin 
Abstract:  In this paper, we propose a new class of tests for overidentifying restrictions in moment condition models. The tests in this new class are easy to compute: they avoid the complicated saddle-point problem in generalized empirical likelihood (GEL) estimation, and only a √n-consistent estimator, where n is the sample size, is needed. In addition to discussing their first-order properties, we establish that under some regularity conditions these tests share the same higher-order properties as GEL overidentifying tests, given proper consistent estimators. A Monte Carlo simulation study shows that the new class of tests of overidentifying restrictions has better finite-sample performance than the two-step GMM overidentification test, and compares well to several potential alternatives in terms of overall performance. 
Keywords:  Generalized empirical likelihood (GEL), tests for overidentifying restrictions, C(α)-type tests, higher-order equivalence 
JEL:  C12 C20 
Date:  2016–01–24 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:69004&r=ecm 
By:  Sucarrat, Genaro; Grønneberg, Steffen 
Abstract:  The probability of an observed financial return being equal to zero is not necessarily zero. This can be due to price discreteness or rounding error, liquidity issues (e.g. low trading volume), market closures, data issues (e.g. data imputation due to missing values), characteristics specific to the market, and so on. Moreover, the zero probability may change and depend on market conditions. In standard models of return volatility, however, e.g. ARCH, SV and continuous-time models, the zero probability is zero, constant or both. We propose a new class of models that allows for a time-varying zero probability, and which can be combined with standard models of return volatility: they are nested and obtained as special cases when the zero probability is constant and equal to zero. Another attraction is that the return properties of the new class (e.g. volatility, skewness, kurtosis, Value-at-Risk, Expected Shortfall) are obtained as functions of the underlying volatility model. The new class allows for autoregressive conditional dynamics in both the zero probability and volatility specifications, and for additional covariates. Simulations show that parameter and risk estimates are biased if zeros are not appropriately handled, and an application illustrates that risk estimates can be substantially biased in practice if the time-varying zero probability is not accommodated. 
Keywords:  Financial return, volatility, zero-inflated return, GARCH, log-GARCH, ACL 
JEL:  C01 C22 C32 C51 C52 C58 
Date:  2016–01–17 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:68931&r=ecm 
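A stylized illustration, not the authors' model, of why ignoring a positive zero probability biases volatility estimates: if a fraction pi_zero of returns is replaced by exact zeros, the naive sample variance understates the latent variance by the factor 1 - pi_zero, and a simple rescaling recovers it. All numbers below are arbitrary toy settings.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 50_000
pi_zero = 0.3        # probability that the observed return is exactly zero

latent = rng.normal(0.0, 1.0, T)                          # latent returns, variance 1
observed = np.where(rng.uniform(size=T) < pi_zero, 0.0, latent)

naive = observed.var()                     # treats zeros as genuine returns
adjusted = observed.var() / (1 - pi_zero)  # rescales by the nonzero probability
print(naive, adjusted)                     # roughly 0.7 versus roughly 1.0
```

In the paper's setting pi_zero is itself time-varying and estimated jointly with the volatility dynamics; this constant-probability sketch only shows the direction and size of the bias when zeros are ignored.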
By:  Francisco Blasques (VU University Amsterdam, the Netherlands); Paolo Gorgi (VU University Amsterdam, the Netherlands, University of Padua, Italy); Siem Jan Koopman (VU University Amsterdam, the Netherlands, Aarhus University, Denmark); Olivier Wintenberger (University of Copenhagen, Denmark, Sorbonne Universités, UPMC University Paris, Sorbonne Universities, France) 
Abstract:  We revisit Wintenberger (2013) on the continuous invertibility of the EGARCH(1,1) model. We note that the definition of continuous invertibility adopted in Wintenberger (2013) may not always be sufficient to deliver strong consistency of the QMLE. We also take the opportunity to provide other small clarifications and additions. 
Keywords:  invertibility, quasi-maximum likelihood estimator, volatility models 
JEL:  C01 C22 C51 
Date:  2015–12–11 
URL:  http://d.repec.org/n?u=RePEc:tin:wpaper:20150131&r=ecm 
By:  JeanMarie Dufour; Alain Trognon; Purevdorj Tuvaandorj 
Abstract:  We study the invariance properties of various test criteria which have been proposed for hypothesis testing in the context of incompletely specified models, such as models formulated in terms of estimating functions (Godambe, 1960, Ann. Math. Stat.) or moment conditions and estimated by generalized method of moments (GMM) procedures (Hansen, 1982, Econometrica), and models estimated by pseudo-likelihood (Gouriéroux, Monfort and Trognon, 1984, Econometrica) and M-estimation methods. The invariance properties considered include invariance to (possibly nonlinear) hypothesis reformulations and reparameterizations. The test statistics examined include Wald-type, LR-type, LM-type, score-type, and C(α)-type criteria. Extending the approach used in Dagenais and Dufour (1991, Econometrica), we show first that all these test statistics except the Wald-type ones are invariant to equivalent hypothesis reformulations (under usual regularity conditions), but that none of the five is generally invariant to model reparameterizations, including measurement-unit changes in nonlinear models. In other words, testing two equivalent hypotheses in the context of equivalent models may lead to completely different inferences. For example, this may occur after an apparently innocuous rescaling of some model variables. Then, with a view to avoiding such undesirable properties, we study restrictions that can be imposed on the objective functions used for pseudo-likelihood (or M-estimation), as well as on the structure of the test criteria used with estimating functions and GMM procedures, to obtain invariant tests. In particular, we show that using linear exponential pseudo-likelihood functions allows one to obtain invariant score-type and C(α)-type test criteria, while in the context of estimating-function (or GMM) procedures it is possible to modify an LR-type statistic proposed by Newey and West (1987, Int. Econ. Rev.) to obtain a test statistic that is invariant to general reparameterizations. The invariance associated with linear exponential pseudo-likelihood functions is interpreted as a strong argument for using such pseudo-likelihood functions in empirical work. 
Keywords:  Testing, invariance, hypothesis reformulation, reparameterization, measurement unit, estimating function, generalized method of moments (GMM), pseudo-likelihood, M-estimator, linear exponential model, nonlinear model, Wald test, likelihood ratio test, score test, Lagrange multiplier test, C(α) test 
JEL:  C3 C12 
Date:  2015–06–24 
URL:  http://d.repec.org/n?u=RePEc:cir:cirwor:2015s27&r=ecm 
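The non-invariance of Wald-type statistics to hypothesis reformulation is easy to see numerically. The estimate and standard error below are assumed toy numbers for illustration: testing theta = 1 and the algebraically equivalent theta^3 = 1 via the delta method yields different Wald statistics from the same data.

```python
# Toy estimate and standard error (assumed numbers for illustration)
theta_hat, se = 1.5, 0.4

def wald(g, dg):
    """Wald statistic for H0: g(theta) = 0, using the delta method."""
    return g(theta_hat)**2 / (dg(theta_hat)**2 * se**2)

w1 = wald(lambda t: t - 1.0, lambda t: 1.0)            # H0: theta = 1
w2 = wald(lambda t: t**3 - 1.0, lambda t: 3.0 * t**2)  # equivalent H0: theta^3 = 1
print(w1, w2)    # the same hypothesis yields two different Wald statistics
```

Here w1 = 1.5625 while w2 is about 0.77, so the two formulations of one hypothesis can lead to opposite test decisions at conventional critical values; this is exactly the Wald-type pathology the abstract describes.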
By:  Bormann, Carsten; Schaumburg, Julia; Schienle, Melanie 
Abstract:  In practice, multivariate dependencies between extreme risks are often assessed only in a pairwise way. We propose a test for detecting situations in which such pairwise measures are inadequate and give incomplete results. This occurs when a significant portion of the multivariate dependence structure in the tails is of dimension higher than two. Our test statistic is based on a decomposition of the stable tail dependence function describing multivariate tail dependence. The asymptotic properties of the test are provided and a bootstrap-based finite-sample version of the test is proposed. A simulation study documents good size and power properties of the test, including settings with time-series components and factor models. In an application to stock indices in non-crisis times, pairwise tail models seem appropriate for global markets, while the test finds them inadmissible for the tightly interconnected European market. From 2007/08 on, however, higher-order dependencies generally increase and require a multivariate tail model in all cases. 
Keywords:  decomposition of multivariate tail dependence, multivariate extreme values, stable tail dependence function, extreme dependence modeling 
JEL:  C01 C46 C58 
Date:  2016 
URL:  http://d.repec.org/n?u=RePEc:zbw:kitwps:80&r=ecm 
By:  Wei Cui; Wolfgang K. Härdle; Weining Wang; 
Abstract:  Estimating the natural rate of unemployment (NAIRU) is important for understanding the joint dynamics of unemployment, inflation, and inflation expectations. However, the existing literature falls short of endogenizing inflation expectations together with the NAIRU in a model-consistent way. We develop and estimate a structural model with forward- and backward-looking Phillips curves. Inflation expectations are treated as a function of state variables, and we use survey data as their observations. We find that the estimated NAIRU using our methodology tracks the unemployment process closely except for the high-inflation period around 1970. Moreover, the estimated Bayesian credible sets are narrower and our model leads to better inflation and unemployment forecasts. These results suggest that monetary policy was very effective during the sample periods and that there was not much room for policy improvement. 
Keywords:  NAIRU; Inflation Expectation; Targeting 
JEL:  C32 E23 E24 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2015010&r=ecm 
By:  JeanFrançois Richard 
Abstract:  We propose a generic algorithm for numerically accurate likelihood evaluation of a broad class of spatial models characterized by a high-dimensional latent Gaussian process and non-Gaussian response variables. The class of models under consideration includes specifications for discrete choices, event counts and limited dependent variables (truncation, censoring, and sample selection), among others. Our algorithm relies upon a novel implementation of Efficient Importance Sampling (EIS) specifically designed to exploit the typical sparsity of high-dimensional spatial precision (or covariance) matrices. It is numerically very accurate and computationally feasible even for very high-dimensional latent processes. Thus maximum likelihood (ML) estimation of high-dimensional non-Gaussian spatial models, hitherto considered computationally prohibitive, becomes feasible. We illustrate our approach with ML estimation of a spatial probit for US presidential voting decisions and spatial count data models (Poisson and Negbin) for firm location choices. 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:pit:wpaper:5778&r=ecm 
By:  Arthur Charpentier (Université du Québec à Montréal (UQAM), Canada; CREM, Centre de Recherche en Economie et Management, Université de Rennes 1, Université de Caen Basse-Normandie, CNRS); Ewen Gallic (CREM, Centre de Recherche en Economie et Management, Université de Rennes 1, Université de Caen Basse-Normandie, CNRS) 
Abstract:  In this paper, we investigate a technique inspired by Ripley's circumference method to correct the bias of density estimation at edges (or frontiers) of regions. The idea of the method was theoretical and difficult to implement. We provide a simple technique, based on properties of Gaussian kernels, to efficiently compute weights that correct border bias at frontiers of the region of interest, with automatic selection of an optimal radius for the method. We illustrate the use of the technique to visualize hot spots of car accidents and campsite locations, as well as locations of bike thefts. 
Keywords:  visualization, spatial process, GIS, kernel density estimation, polygons, Ripley's circumference method, border bias, edge correction, frontier 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:halshs01238499&r=ecm 
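The border-bias problem and a Gaussian-kernel correction in the spirit of the method can be sketched in one dimension. The paper works with two-dimensional polygons; the weights below are the standard cut-and-normalize renormalization by the kernel mass inside the region, an assumption for illustration rather than the authors' exact formula.

```python
import math
import numpy as np

rng = np.random.default_rng(5)
n, h = 100_000, 0.1
x = rng.uniform(0.0, 1.0, n)    # true density is 1 everywhere on [0, 1]

def kde(u, pts, h, correct=False):
    """Gaussian kernel density estimate at u; optionally renormalize at borders."""
    k = np.exp(-0.5 * ((u - pts) / h)**2) / (h * math.sqrt(2.0 * math.pi))
    est = k.mean()
    if correct:
        # divide by the Gaussian mass that falls inside the region [0, 1]
        ndtr = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
        est /= ndtr((1.0 - u) / h) - ndtr(-u / h)
    return est

print(kde(0.0, x, h), kde(0.0, x, h, correct=True))
```

Without correction the estimate at the frontier is roughly 0.5 (half the kernel mass leaks outside the region); with the renormalization weight it returns to roughly the true value 1, while interior points are essentially unaffected.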
By:  Neil Shephard; Justin Yang; Mark Podolskij; Robert Stelzer; S Thorbjornsen 
URL:  http://d.repec.org/n?u=RePEc:qsh:wpaper:360826&r=ecm 
By:  Rjiba, Meriem; Tsagris, Michail; Mhalla, Hedi 
Abstract:  We evaluate the predictive performance of a variety of value-at-risk (VaR) models for a portfolio consisting of five assets. Traditional VaR models such as historical simulation with bootstrap and filtered historical simulation methods are considered. We suggest a new method for estimating Value at Risk: the filtered historical simulation GJR-GARCH method, based on bootstrapping the standardized GJR-GARCH residuals. The predictive performance is evaluated in terms of three criteria: the tests of unconditional coverage, independence and conditional coverage, and a quadratic loss function. The results show that the classical methods are inefficient under moderate departures from normality and that the new method produces the most accurate forecasts of extreme losses. 
Keywords:  Value at Risk, bootstrap, GARCH 
JEL:  C15 G17 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:68842&r=ecm 
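Filtered historical simulation follows three steps: filter the returns into standardized residuals with a volatility model, bootstrap those residuals, and rescale them by the forecast volatility. The sketch below substitutes a simple EWMA filter for the paper's fitted GJR-GARCH and uses simulated placeholder returns, so it illustrates the mechanics only.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 2_000
r = 0.01 * rng.standard_normal(T)   # placeholder daily return series

# 1) volatility filter (EWMA stands in for the fitted GJR-GARCH)
lam = 0.94
sig2 = np.empty(T)
sig2[0] = r.var()
for t in range(1, T):
    sig2[t] = lam * sig2[t - 1] + (1 - lam) * r[t - 1]**2
z = r / np.sqrt(sig2)               # standardized residuals

# 2) bootstrap the standardized residuals, rescale by tomorrow's volatility
sig2_next = lam * sig2[-1] + (1 - lam) * r[-1]**2
B = 10_000
boot = rng.choice(z, B, replace=True) * np.sqrt(sig2_next)

# 3) one-day 99% VaR as the lower empirical quantile of the simulated returns
var_99 = -np.quantile(boot, 0.01)
print(var_99)
```

Because the bootstrap draws from the empirical distribution of the standardized residuals, the resulting VaR inherits any fat tails or skewness in the data rather than imposing normality, which is the motivation for filtered historical simulation over the classical variance-covariance approach.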
By:  Lilia Maliar; Serguei Maliar; John B. Taylor (Stanford University); Inna Tsener 
Abstract:  We study a class of infinite-horizon nonlinear dynamic economic models in which preferences, technology and laws of motion for exogenous variables can change over time either deterministically or stochastically (according to a Markov process with time-varying transition probabilities), or both. The studied models are nonstationary in the sense that the decision and value functions are time-dependent, and they cannot generally be solved by conventional solution methods. We introduce a quantitative framework, called extended function path (EFP), for calibrating, solving, simulating and estimating such models. We apply EFP to analyze a collection of challenging applications that do not admit stationary Markov equilibria, including growth models with anticipated parameter shifts and drifts, unbalanced growth under capital-augmenting technological progress, anticipated regime switches, deterministically time-varying volatility and seasonal fluctuations. We also show an example of estimating and calibrating parameters in an unbalanced growth model using data on the U.S. economy. Examples of MATLAB code are provided. 
Keywords:  nonstationary models, unbalanced growth, time-varying transition probabilities, time-varying parameters, anticipated shock, shooting method, parameter shift, parameter drift, regime switch, stochastic volatility, capital augmenting, seasonality, Fair and Taylor, extended path, Smolyak method 
JEL:  C61 C63 C68 E31 E52 
Date:  2015–03 
URL:  http://d.repec.org/n?u=RePEc:hoo:wpaper:15105&r=ecm 