
on Econometrics 
By:  Su Liangjun (Singapore Management University ); Tadao Hoshino (Waseda University ) 
Abstract:  In this paper, we consider sieve instrumental variable quantile regression (IVQR) estimation of functional coefficient models where the coefficients of endogenous regressors are unknown functions of some exogenous covariates. We approximate the unknown functional coefficients by basis functions and estimate them by the IVQR technique. We establish the uniform consistency and asymptotic normality of the estimators of the functional coefficients. Based on the sieve estimates, we propose a nonparametric specification test for the constancy of the functional coefficients, study its asymptotic properties under the null hypothesis, a sequence of local alternatives and global alternatives, and propose a wild bootstrap procedure to obtain the bootstrap p-values. A set of Monte Carlo simulations is conducted to evaluate the finite-sample behavior of both the estimator and the test statistic. As an empirical illustration of our theoretical results, we present the estimation of quantile Engel curves. 
Keywords:  Endogeneity; Functional coefficient; Heterogeneity; Instrumental variable; Panel data; Sieve estimation; Specification test; Structural quantile function 
JEL:  C12 C13 C14 C21 C23 C26 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:siu:wpaper:012015&r=ecm 
By:  Chudik, Alexander (Federal Reserve Bank of Dallas ); Mohaddes, Kamiar (Girton College ); Pesaran, M. Hashem (University of Southern California ); Raissi, Mehdi (International Monetary Fund ) 
Abstract:  This paper develops a cross-sectionally augmented distributed lag (CS-DL) approach to the estimation of long-run effects in large dynamic heterogeneous panel data models with cross-sectionally dependent errors. The asymptotic distribution of the CS-DL estimator is derived under coefficient heterogeneity in the case where the time dimension (T) and the cross-section dimension (N) are both large. The CS-DL approach is compared with more standard panel data estimators that are based on autoregressive distributed lag (ARDL) specifications. It is shown that, unlike the ARDL-type estimator, the CS-DL estimator is robust to misspecification of dynamics and error serial correlation. The theoretical results are illustrated with small-sample evidence obtained by means of Monte Carlo simulations, which suggest that the performance of the CS-DL approach is often superior to the alternative panel ARDL estimates, particularly when T is not too large and lies in the range of 30≤T 
JEL:  C23 
Date:  2015–01–01 
URL:  http://d.repec.org/n?u=RePEc:fip:feddgw:223&r=ecm 
By:  Kyriacou, Maria ; Phillips, Peter C.B. ; Rossi, Francesca 
Abstract:  Ordinary least squares (OLS) is well known to produce an inconsistent estimator of the spatial parameter in pure spatial autoregression (SAR). This paper explores the potential of indirect inference to correct the inconsistency of OLS. Under broad conditions, it is shown that indirect inference (II) based on OLS produces consistent and asymptotically normal estimates in pure SAR regression. The II estimator is robust to departures from normal disturbances and is computationally straightforward compared with pseudo-Gaussian maximum likelihood (PML). Monte Carlo experiments based on various specifications of the weighting matrix confirm that the indirect inference estimator displays little bias even in very small samples and gives overall performance that is comparable to the Gaussian PML. 
Keywords:  bias, binding function, inconsistency, indirect inference, spatial autoregression 
Date:  2014–09–22 
URL:  http://d.repec.org/n?u=RePEc:stn:sotoec:1418&r=ecm 
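The indirect-inference idea in the abstract above can be sketched numerically: approximate the binding function (the mean of the inconsistent OLS estimator as a function of the true spatial parameter) by simulation with common random numbers, then invert it at the observed OLS estimate. The function names and the simple circulant weighting matrix used in testing are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.optimize import brentq

def ols_sar(y, W):
    # OLS estimate of lambda in the pure SAR model y = lambda*W*y + e
    # (known to be inconsistent).
    Wy = W @ y
    return float(Wy @ y / (Wy @ Wy))

def make_binding_function(W, n_sim=300, seed=42):
    # Monte Carlo approximation of the binding function
    # b(lambda) = E[lambda_OLS | lambda].  Common random numbers (the
    # fixed error draws E) make b smooth in lambda, so a root finder
    # can invert it reliably.
    n = W.shape[0]
    E = np.random.default_rng(seed).standard_normal((n_sim, n))
    def b(lam):
        A_inv = np.linalg.inv(np.eye(n) - lam * W)
        return np.mean([ols_sar(A_inv @ e, W) for e in E])
    return b

def indirect_inference_sar(y, W, lo=-0.9, hi=0.9):
    # II estimator: the lambda whose binding-function value matches the
    # observed (biased) OLS estimate.
    lam_ols = ols_sar(y, W)
    b = make_binding_function(W)
    return brentq(lambda lam: b(lam) - lam_ols, lo, hi)
```

A row-normalized "two nearest neighbours" circulant matrix is a convenient toy choice of W for experimenting with this sketch.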
By:  Russell Davidson (Department of Economics and CIREQ, McGill University ); Andrea Monticini (Dipartimento di Economia e Finanza, Università Cattolica del Sacro Cuore ) 
Abstract:  In many, if not most, econometric applications, it is impossible to estimate consistently the elements of the white-noise process or processes that underlie the DGP. A common example is a regression model with heteroskedastic and/or autocorrelated disturbances, where the heteroskedasticity and autocorrelation are of unknown form. A particular version of the wild bootstrap can be shown to work very well with many models, both univariate and multivariate, in the presence of heteroskedasticity. Nothing comparable appears to exist for handling serial correlation. Recently, the dependent wild bootstrap has been proposed. Here, we extend this new method, and link it to the well-known HAC covariance estimator, in much the same way as one can link the wild bootstrap to the HCCME. It works very well even with sample sizes smaller than 50, and merits considerable further study. 
Keywords:  Bootstrap, time series, wild bootstrap, dependent wild bootstrap, HAC covariance matrix estimator 
JEL:  C12 C22 C32 
Date:  2014–03 
URL:  http://d.repec.org/n?u=RePEc:ctc:serie1:def012&r=ecm 
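For intuition, the dependent wild bootstrap idea can be sketched as follows: residuals are multiplied by Gaussian variables whose correlation decays with temporal distance according to a kernel and bandwidth (the Bartlett kernel below), so the resamples preserve local serial dependence. The function names and the interval-for-the-mean example are our illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def dependent_wild_multipliers(n, l, rng):
    # Gaussian multipliers with mean 0, unit variance, and covariance
    # k((s - t)/l) given by the Bartlett kernel, in the spirit of the
    # dependent wild bootstrap.  Generated via a Cholesky factor of the
    # kernel covariance matrix (small jitter for numerical stability).
    idx = np.arange(n)
    lags = np.abs(idx[:, None] - idx[None, :]) / l
    cov = np.clip(1.0 - lags, 0.0, None)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))
    return L @ rng.standard_normal(n)

def dwb_mean_ci(x, l=5, B=999, alpha=0.05, seed=0):
    # Percentile-style dependent wild bootstrap interval for the mean of
    # a possibly serially correlated series: resample as e*_t = e_t * w_t.
    rng = np.random.default_rng(seed)
    n, xbar = len(x), x.mean()
    e = x - xbar
    boot = np.array([xbar + np.mean(e * dependent_wild_multipliers(n, l, rng))
                     for _ in range(B)])
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return float(lo), float(hi)
```

The bandwidth l plays the same tuning role as the HAC truncation lag: larger l captures longer-range dependence at the cost of more bootstrap variance.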
By:  Arthur Charpentier (Université du Québec à Montréal, CREM & GERAD ); Emmanuel Flachaire (Aix-Marseille University (Aix-Marseille School of Economics), CNRS & EHESS; Institut Universitaire de France ) 
Abstract:  Standard kernel density estimation methods are very often used in practice to estimate density functions. They work well in numerous cases. However, they are known not to work so well with skewed, multimodal and heavy-tailed distributions. Such features are usual with income distributions, which are defined over the positive support. In this paper, we show that a preliminary logarithmic transformation of the data, combined with standard kernel density estimation methods, can provide a much better fit of the density. 
Keywords:  nonparametric density estimation, heavy tails, income distribution, data transformation, lognormal kernel 
JEL:  C15 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:aim:wpaimx:1506&r=ecm 
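A minimal sketch of the transformation idea above (the function name is ours, and `gaussian_kde` stands in for whichever standard kernel estimator is used): estimate the density of log X with an ordinary Gaussian KDE, then map back to the level of X with the change-of-variables factor 1/y.

```python
import numpy as np
from scipy.stats import gaussian_kde

def log_transform_kde(x, grid):
    # Fit a standard Gaussian KDE to log(x), then recover the density of
    # x itself via the change of variables f(y) = g(log y) / y, y > 0.
    g = gaussian_kde(np.log(x))
    grid = np.asarray(grid, dtype=float)
    return g(np.log(grid)) / grid
```

On positively skewed samples such as lognormal data, this typically tracks the right tail far better than a direct KDE on the levels, because the bandwidth is chosen on the roughly symmetric log scale.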
By:  Ahmad Farid Osman ; Maxwell L. King 
Abstract:  There is evidence that exponential smoothing methods, as well as time-varying parameter models, perform relatively well in forecasting comparisons. The aim of this paper is to introduce a new forecasting technique by integrating the exponential smoothing model with regressors whose coefficients are time varying. In doing this, we construct an exponential smoothing model with regressors by extending Holt's linear exponential smoothing model. We then translate it into an equivalent state space structure so that the parameters can be estimated via the maximum likelihood estimation procedure. Due to a potential problem in the updating equation for the regressor coefficients when the change in a regressor is too small, we propose an alternative structure of the state space model which allows the updating process to be put on hold until sufficient information is available. An empirical study of forecast accuracy shows that the new model performs better than the existing exponential smoothing model as well as the linear regression model. 
Keywords:  State space model, Single source of error, Time varying parameter, Time series, Forecast accuracy 
JEL:  C51 C53 
Date:  2015 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:20152&r=ecm 
By:  Shonosuke Sugasawa (Graduate School of Economics, The University of Tokyo ); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo ) 
Abstract:  The Box-Cox transformation is applied to linear mixed models for analyzing positive and clustered data. The problem is that the maximum likelihood estimator of the transformation parameter is not consistent. To fix this, we suggest a simple and consistent estimator for the transformation parameter based on the method of moments. The consistent estimator is used to construct consistent estimators of the parameters involved in the model and to provide an empirical predictor of a linear combination of both fixed and random effects. Second-order accurate prediction intervals for measuring uncertainty of the predictor are derived. Finally, the performance of the proposed procedure is investigated through simulation and empirical studies. 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2015cf957&r=ecm 
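The flavour of a moment-based choice of the Box-Cox parameter can be illustrated with a deliberately simple moment condition: pick the lambda that makes the transformed data's sample skewness zero. This is only a stand-in — the paper's estimator is built for the mixed-model setting and uses its own moment equations — and all names below are ours.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import skew

def boxcox(y, lam):
    # Box-Cox transform, with the usual log limit at lambda = 0.
    return np.log(y) if abs(lam) < 1e-8 else (y ** lam - 1.0) / lam

def moment_boxcox_lambda(y, lo=-2.0, hi=2.0):
    # Hypothetical moment-based choice: solve skew(boxcox(y, lam)) = 0.
    # Skewness is increasing in lambda for right-skewed positive data,
    # so a bracketing root finder suffices.
    return brentq(lambda lam: skew(boxcox(y, lam)), lo, hi)
```

For exactly lognormal data the population solution is lambda = 0 (the log transform), which makes this a handy sanity check.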
By:  Omay, Tolga ; Hasanov, Mubariz ; Emirmahmutoglu, Furkan 
Abstract:  In this study, we propose a new unit root test procedure that allows for both a gradual structural break and asymmetric nonlinear adjustment towards the equilibrium level. Small-sample properties of the new test are examined through Monte Carlo simulations. The simulation results suggest that the new test has satisfactory size and power properties. We then apply this new test, along with other unit root tests, to examine the stationarity properties of real exchange rate series of the sample countries. Our test rejects the null of a unit root in more cases when compared to the alternative tests. Overall, we find that the PPP proposition holds in the majority of the European countries examined in this paper. 
Keywords:  Smooth Structural Break; Nonlinear Unit Root test; PPP 
JEL:  C12 C22 F41 
Date:  2014–09–03 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:62335&r=ecm 
By:  Jia Chen ; Degui Li ; Oliver Linton ; Zudi Lu 
Abstract:  Dynamic portfolio choice has been a central and essential objective for institutional investors in active asset management. In this paper, we study dynamic portfolio choice with multiple conditioning variables, where the number of conditioning variables can be either fixed or diverging to infinity at a certain polynomial rate of the sample size. We propose a novel data-driven method to estimate the optimal portfolio choice, motivated by the model averaging marginal regression approach suggested by Li, Linton and Lu (2015). More specifically, in order to avoid the curse of dimensionality associated with the multivariate nonparametric regression problem and to make the method practically implementable, we first estimate the marginal optimal portfolio choice by maximising the conditional utility function for each univariate conditioning variable, and then construct the joint dynamic optimal portfolio through a weighted average of the marginal optimal portfolios across all the conditioning variables. Under some regularity conditions, we establish the large-sample properties of the developed portfolio choice procedure. Both simulation studies and an empirical application demonstrate the performance of the proposed methodology. 
Keywords:  Conditioning variables, kernel smoothing, model averaging, portfolio choice, utility function 
JEL:  C13 C14 C32 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:yor:yorken:15/01&r=ecm 
By:  Biørn, Erik (Dept. of Economics, University of Oslo ) 
Abstract:  The measurement error problem in linear time series regression, with focus on the impact of error memory, modeled as finite-order MA processes, is considered. Three prototype models, two bivariate and one univariate ARMA, and ways of handling the problem by using instrumental variables (IVs) are discussed as examples. One example has a bivariate regression equation that is static, although with dynamics entering via the memory of its latent variables. The examples illustrate how 'structural dynamics' interacting with measurement error memory create bias in Ordinary Least Squares (OLS), and illustrate the potential of IV estimation procedures. Supplementary Monte Carlo simulations are provided for two of the example models. 
Keywords:  Errors in variables; ARMA; Error memory; Simultaneity bias; Attenuation; Monte Carlo 
JEL:  C22 C26 C32 C36 C52 C53 
Date:  2014–12–30 
URL:  http://d.repec.org/n?u=RePEc:hhs:osloec:2014_028&r=ecm 
By:  Seojeong Lee (School of Economics, Australian School of Business, the University of New South Wales ) 
Abstract:  Under treatment effect heterogeneity, an instrument identifies the instrument-specific local average treatment effect (LATE). If a regression model is estimated by two-stage least squares (2SLS) using multiple instruments, then 2SLS is consistent for a weighted average of different LATEs. In practice, a rejection of the overidentifying restrictions test can indicate that there is more than one LATE. What is often overlooked in the literature is that the postulated moment condition evaluated at the 2SLS estimand does not hold unless those LATEs are the same. In that case, the conventional heteroskedasticity-robust variance estimator would be inconsistent. However, 2SLS standard errors based on the conventional variance estimator have been reported even when the overidentifying restrictions test is rejected. I propose a consistent estimator for the asymptotic variance of 2SLS by using the result of Hall and Inoue (2003) on misspecified moment condition models. This can be used to calculate the standard errors correctly regardless of whether there is more than one LATE. 
Keywords:  local average treatment effect, treatment heterogeneity, two-stage least squares, variance estimator, model misspecification 
JEL:  C13 C31 C36 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:swe:wpaper:201501&r=ecm 
By:  David M. Kaplan (Department of Economics, University of Missouri-Columbia ); Matt Goldman 
Abstract:  Using and extending fractional order statistic theory, we characterize the O(n^{-1}) coverage probability error of the confidence intervals for population quantiles proposed in Hutson (1999), which use L-statistics as endpoints. We derive an analytic expression for the n^{-1} term, which may be used to calibrate the nominal coverage level to get O(n^{-3/2} log(n)) coverage error. Asymptotic power is shown to be optimal. Using kernel smoothing, we propose a related method for nonparametric inference on conditional quantiles. This new method compares favorably with asymptotic normality and bootstrap methods in theory and in simulations. Code is provided for both unconditional and conditional inference. 
Keywords:  fractional order statistics, high-order accuracy, inference-optimal bandwidth, kernel smoothing 
JEL:  C21 
Date:  2015–02–09 
URL:  http://d.repec.org/n?u=RePEc:umc:wpaper:1502&r=ecm 
By:  Florian Ziel 
Abstract:  Due to the increasing impact of big data, shrinkage algorithms are of great importance in almost every area of statistics, as well as in time series analysis. In the current literature the focus is on lasso-type algorithms for autoregressive time series models with homoscedastic residuals. In this paper we present an iteratively reweighted adaptive lasso algorithm for the estimation of time series models under conditional heteroscedasticity in a high-dimensional setting. We analyse the asymptotic behaviour of the resulting estimator, which turns out to be significantly better than its homoscedastic counterpart. Moreover, we discuss a special case of this algorithm that is suitable for estimating multivariate AR-ARCH type models in a very fast fashion. Several model extensions like periodic AR-ARCH or ARMA-GARCH are discussed. Finally, we show different simulation results and an application to electricity price and load data. 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1502.06557&r=ecm 
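The core of an iteratively reweighted adaptive lasso can be sketched with the standard column-rescaling trick: solve an ordinary lasso on columns divided by the penalty weights, then undo the rescaling, re-deriving the weights from the previous iterate. This is a simplified homoscedastic sketch under our own naming — the paper additionally reweights observations to handle conditional heteroscedasticity.

```python
import numpy as np
from sklearn.linear_model import Lasso

def iteratively_reweighted_adaptive_lasso(X, y, alpha=0.1, n_iter=3, eps=1e-4):
    # Adaptive lasso via column rescaling: minimizing
    #   ||y - X b||^2 + alpha * sum_j w_j |b_j|
    # is equivalent to a plain lasso in a_j = w_j * b_j on X_j / w_j.
    # The weights w_j = 1/|b_j| are refreshed from the previous iterate,
    # so small coefficients are penalized ever more heavily.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # initial (OLS) weights
    for _ in range(n_iter):
        w = 1.0 / (np.abs(beta) + eps)               # adaptive penalty weights
        Xs = X / w                                   # rescale columns
        model = Lasso(alpha=alpha, fit_intercept=False,
                      max_iter=10000).fit(Xs, y)
        beta = model.coef_ / w                       # undo the rescaling
    return beta
```

With a sparse true coefficient vector, the reweighting typically zeroes out the noise coefficients exactly after the first pass while leaving the large coefficients nearly unshrunk.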
By:  Francesca Monti (Bank of England ; Centre for Macroeconomics (CFM) ) 
Abstract:  This paper proposes a method for detecting the sources of misspecification in a DSGE model based on testing, in a data-rich environment, the exogeneity of the variables of the DSGE with respect to some auxiliary variables. Finding evidence of non-exogeneity implies misspecification, but finding that some specific variables help predict certain shocks can shed light on the dimensions along which the model is misspecified. Forecast error variance decomposition analysis then helps assess the relevance of the missing channels. The paper puts the proposed methodology to work both in a controlled experiment - by running Monte Carlo simulations with a known DGP - and using a state-of-the-art model and US data up to 2011. 
Keywords:  DSGE Models, Model Misspecification, Bayesian Analysis 
JEL:  C32 C52 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:cfm:wpaper:1505&r=ecm 
By:  Süß, Philipp 
Abstract:  The following note proposes a simple procedure to estimate k parameters of interest in a linear model with potentially k conditionally endogenous variables of interest and m endogenous control variables, in the presence of at least one instrumental variable, under the assumption of conditional mean independence. 
Keywords:  Instrumental variables; Conditional independence assumption; Underidentified model 
JEL:  C26 
Date:  2015–02–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:62030&r=ecm 
By:  Omay, Tolga ; Yildirim, Dilem 
Abstract:  We develop unit root tests that allow, under the alternative hypothesis, for a smooth transition between deterministic linear trends, around which stationary asymmetric adjustment may occur, by employing exponential smooth transition autoregressive (ESTAR) models. The small-sample properties of the newly developed test are briefly investigated, and an application investigating the PPP hypothesis for Argentina is provided. 
Keywords:  Smooth Break; Nonlinear Unit Root Test; PPP 
JEL:  C12 C22 F4 
Date:  2013–05–10 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:62334&r=ecm 
By:  Anindya Banerjee ; Massimiliano Marcellino ; Igor Masten 
Abstract:  The Factor-augmented Error Correction Model (FECM) generalizes the factor-augmented VAR (FAVAR) and the Error Correction Model (ECM), combining error correction, cointegration and dynamic factor models. It uses a larger set of variables compared to the ECM and incorporates the long-run information lacking from the FAVAR because of the latter's specification in differences. In this paper we review the specification and estimation of the FECM, and illustrate its use for forecasting and structural analysis by means of empirical applications based on Euro Area and US data. 
Keywords:  Dynamic Factor Models, Cointegration, Structural Analysis, Factor-augmented Error Correction Models, FAVAR 
JEL:  C32 E17 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:bir:birmec:1503&r=ecm 
By:  Damien Challet 
Abstract:  Counting the number of local extrema of the cumulative sum of data points yields the R-test, a new single-sample nonparametric test. Numerical simulations indicate that the R-test is more powerful than Student's t-test for semi-heavy and heavy-tailed distributions, equivalent for Gaussian variables, and slightly worse for uniformly distributed variables. Finally, the R-test has greater power than the single-sample Wilcoxon signed-rank test in most cases. 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1502.05367&r=ecm 
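The statistic described above — the number of local extrema of the cumulative sum — is simple to compute: a location shift drags the partial-sum path toward a monotone drift with few direction changes. The sign-flip calibration below is our illustrative choice of null distribution, not necessarily the paper's.

```python
import numpy as np

def count_local_extrema(s):
    # Count interior points where the sequence changes direction.
    d = np.sign(np.diff(s))
    d = d[d != 0]
    return int(np.sum(d[1:] != d[:-1]))

def r_statistic(x):
    # Number of local extrema of the cumulative sum of the data.  Under a
    # zero-location null the partial-sum path changes direction often;
    # a nonzero location produces a drifting path with fewer extrema.
    return count_local_extrema(np.cumsum(x))

def r_test_pvalue(x, B=2000, seed=0):
    # Left-tailed p-value calibrated by sign-flip resampling (an
    # illustrative calibration; small statistics are evidence against
    # the symmetric-about-zero null).
    rng = np.random.default_rng(seed)
    obs = r_statistic(x)
    null = [r_statistic(rng.choice([-1.0, 1.0], size=len(x)) * x)
            for _ in range(B)]
    return float(np.mean([s <= obs for s in null]))
```

Note that since the increments of the cumulative sum are the data themselves, the statistic equals the number of sign changes in the sample, which makes its behaviour under a location shift easy to reason about.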
By:  Francesco Lamperti 
Abstract:  Simulated models suffer intrinsically from validation and comparison problems. The choice of a suitable indicator quantifying the distance between the model and the data is pivotal to model selection. However, how to validate and discriminate between alternative models is still an open problem calling for further investigation, especially in light of the increasing use of simulations in social sciences. In this paper, we present an information theoretic criterion to measure how closely a model's synthetic output replicates the properties of observable time series, without the need to resort to any likelihood function or to impose stationarity requirements. The indicator is sufficiently general to be applied to any kind of model able to simulate or predict time series data, from simple univariate models such as autoregressive moving average (ARMA) and Markov processes to more complex objects including agent-based or dynamic stochastic general equilibrium models. More specifically, we use a simple function of the L-divergence computed at different block lengths in order to select the model that is best able to reproduce the distributions of time changes in the data. To evaluate the L-divergence, probabilities are estimated across frequencies including a correction for the systematic bias. Finally, using a known data generating process, we show how this indicator can be used to validate and discriminate between different models, providing a precise measure of the distance between each of them and the data. 
Keywords:  Simulations, Empirical Validation, Time Series, Agent Based Models 
Date:  2015–02–24 
URL:  http://d.repec.org/n?u=RePEc:ssa:lemwps:2015/02&r=ecm 
By:  Stéphane Mussard ; Fattouma Souissi-Benrejab 
Abstract:  Data contamination and excessive correlations between regressors (multicollinearity) constitute a standard and major problem in econometrics. Two techniques address these problems separately: the Gini regression for the former, and the PLS (partial least squares) regression for the latter. Gini-PLS regressions are proposed in order to treat extreme values and multicollinearity simultaneously. 
Date:  2015–02 
URL:  http://d.repec.org/n?u=RePEc:lam:wpaper:1503&r=ecm 
By:  Task Force Members Include: Lilli Japec ; Frauke Kreuter ; Marcus Berg ; Paul Biemer ; Paul Decker ; Cliff Lampe ; Julia Lane ; Cathy O'Neil ; Abe Usher 
Abstract:  In recent years we have seen an increase in the amount of statistics in society describing different phenomena based on so-called Big Data. The term Big Data covers a variety of data, as explained in the report, many of them characterized not just by their large volume, but also by their variety and velocity, the organic way in which they are created, and the new types of processes needed to analyze them and make inference from them. The changes in the nature of these new types of data, their availability, and the way in which they are collected and disseminated are fundamental. Together they constitute a paradigm shift for survey research. 
Keywords:  AAPOR, Big Data 
JEL:  C 
Date:  2015–02–12 
URL:  http://d.repec.org/n?u=RePEc:mpr:mprres:4eb9b798fd5b42a8b53a9249c7661dd8&r=ecm 
By:  Junior Maih 
Abstract:  In an environment where economic structures break, variances change, distributions shift, conventional policies weaken and past events tend to reoccur, economic agents have to form expectations over different regimes. This makes the regime-switching dynamic stochastic general equilibrium (RS-DSGE) model the natural framework for analyzing the dynamics of macroeconomic variables. We present efficient solution methods for solving this class of models, allowing for the transition probabilities to be endogenous and for agents to react to anticipated events. The solution algorithms derived use a perturbation strategy which, unlike what has been proposed in the literature, does not rely on the partitioning of the switching parameters. These algorithms are all implemented in RISE, a flexible object-oriented toolbox that can easily integrate alternative solution methods. We show that our algorithms replicate various examples found in the literature. Among those is a switching RBC model for which we present a third-order perturbation solution. 
Keywords:  DSGE, Markov switching, Sylvester equation, Newton algorithm, perturbation, matrix polynomial 
Date:  2014–12 
URL:  http://d.repec.org/n?u=RePEc:bny:wpaper:0028&r=ecm 
By:  Alexander Schnurr 
Abstract:  We introduce two types of ordinal pattern dependence between time series. Positive (resp. negative) ordinal pattern dependence can be seen as a nonparametric and, in particular, nonlinear counterpart to positive (resp. negative) correlation. We show in an explorative study that both types of this dependence show up in real-world financial data. 
Date:  2015–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1502.07321&r=ecm 
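The basic ingredient — ordinal patterns — is easy to compute: each length-m window of a series is mapped to the permutation that sorts its values. A simple (illustrative, not the authors') descriptive measure of positive ordinal pattern dependence is the excess coincidence of patterns over what independence of the two pattern sequences would predict.

```python
import numpy as np
from itertools import permutations

def ordinal_patterns(x, m=3):
    # Map each length-m window to the permutation ranking its values.
    return [tuple(int(i) for i in np.argsort(x[k:k + m]))
            for k in range(len(x) - m + 1)]

def ordinal_pattern_dependence(x, y, m=3):
    # Proportion of time points at which the two series show the same
    # ordinal pattern, minus the coincidence expected if the pattern
    # sequences were independent (computed from the marginal pattern
    # frequencies).  Positive values suggest positive ordinal pattern
    # dependence.
    px, py = ordinal_patterns(x, m), ordinal_patterns(y, m)
    coincide = np.mean([a == b for a, b in zip(px, py)])
    pats = list(permutations(range(m)))
    fx = np.array([np.mean([p == q for p in px]) for q in pats])
    fy = np.array([np.mean([p == q for p in py]) for q in pats])
    return float(coincide - fx @ fy)
```

Because only the orderings within windows matter, the measure is invariant to monotone transformations of each series, which is what makes it a nonlinear counterpart to correlation.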