nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒02‒28
23 papers chosen by
Sune Karlsson
Örebro universitet

  1. Sieve Instrumental Variable Quantile Regression Estimation of Functional Coefficient Models By Su Liangjun ; Tadao Hoshino
  2. Long-run effects in large heterogenous panel data models with cross-sectionally correlated errors By Chudik, Alexander ; Mohaddes, Kamiar ; Pesaran, M. Hashem ; Raissi, Mehdi
  3. Indirect inference in spatial autoregression By Kyriacou, Maria ; Phillips, Peter C.B. ; Rossi, Francesca
  4. Heteroskedasticity-and-Autocorrelation-Consistent Bootstrapping By Russell Davidson ; Andrea Monticini
  5. Log-Transform Kernel Density Estimation of Income Distribution By Arthur Charpentier ; Emmanuel Flachaire
  6. A new approach to forecasting based on exponential smoothing with independent regressors By Ahmad Farid Osman ; Maxwell L. King
  7. "Box-Cox Transformed Linear Mixed Models for Positive-Valued and Clustered Data" By Shonosuke Sugasawa ; Tatsuya Kubokawa
  8. Structural Break, Nonlinearity, and Asymmetry: A re-examination of PPP proposition By Omay, Tolga ; Hasanov, Mubariz ; Emirmahmutoglu, Furkan
  9. Semiparametric Dynamic Portfolio Choice with Multiple Conditioning Variables By Jia Chen ; Degui Li ; Oliver Linton ; Zudi Lu
  10. By Biørn, Erik
  11. A Consistent Variance Estimator for 2SLS When Instruments Identify Different LATEs By Seojeong Lee
  12. Fractional order statistic approximation for nonparametric conditional quantile inference By David M. Kaplan ; Matt Goldman
  13. Iteratively reweighted adaptive lasso for conditional heteroscedastic time series with applications to AR-ARCH type processes By Florian Ziel
  14. Can a data-rich environment help identify the sources of model misspecification? By Francesca Monti
  15. A simple procedure to estimate k structural parameters on conditionally endogenous variables with one conditionally mean independent instrument in linear models By Süß, Philipp
  16. Nonlinearity and Smooth Breaks in Unit Root Testing By Omay, Tolga ; Yildirim, Dilem
  17. An Overview of the Factor-augmented Error-Correction Model By Anindya Banerjee ; Massimiliano Marcellino ; Igor Masten
  18. Detecting weak signals with record statistics By Damien Challet
  19. An Information Theoretic Criterion for Empirical Validation of Time Series Models By Stefano Lamperti
  20. Gini-PLS Regressions By Stéphane Mussard ; Fattouma Souissi-Benrejab
  21. AAPOR Report on Big Data By Task Force Members Include: Lilli Japec ; Frauke Kreuter ; Marcus Berg ; Paul Biemer ; Paul Decker ; Cliff Lampe ; Julia Lane ; Cathy O'Neil ; Abe Usher
  22. Efficient Perturbation Methods for Solving Regime-Switching DSGE Models By Junior Maih
  23. An Ordinal Pattern Approach to Detect and to Model Leverage Effects and Dependence Structures Between Financial Time Series By Alexander Schnurr

  1. By: Su Liangjun (Singapore Management University ); Tadao Hoshino (Waseda University )
    Abstract: In this paper, we consider sieve instrumental variable quantile regression (IVQR) estimation of functional coefficient models where the coefficients of endogenous regressors are unknown functions of some exogenous covariates. We approximate the unknown functional coefficients by some basis functions and estimate them by the IVQR technique. We establish the uniform consistency and asymptotic normality of the estimators of the functional coefficients. Based on the sieve estimates, we propose a nonparametric specification test for the constancy of the functional coefficients, study its asymptotic properties under the null hypothesis, a sequence of local alternatives and global alternatives, and propose a wild-bootstrap procedure to obtain the bootstrap p-values. A set of Monte Carlo simulations is conducted to evaluate the finite sample behavior of both the estimator and the test statistic. As an empirical illustration of our theoretical results, we present the estimation of quantile Engel curves.
    Keywords: Endogeneity; Functional coefficient; Heterogeneity; Instrumental variable; Panel data; Sieve estimation; Specification test; Structural quantile function
    JEL: C12 C13 C14 C21 C23 C26
    Date: 2015–02
  2. By: Chudik, Alexander (Federal Reserve Bank of Dallas ); Mohaddes, Kamiar (Girton College ); Pesaran, M. Hashem (University of Southern California ); Raissi, Mehdi (International Monetary Fund )
    Abstract: This paper develops a cross-sectionally augmented distributed lag (CS-DL) approach to the estimation of long-run effects in large dynamic heterogeneous panel data models with cross-sectionally dependent errors. The asymptotic distribution of the CS-DL estimator is derived under coefficient heterogeneity in the case where the time dimension (T) and the cross-section dimension (N) are both large. The CS-DL approach is compared with more standard panel data estimators that are based on autoregressive distributed lag (ARDL) specifications. It is shown that unlike the ARDL type estimator, the CS-DL estimator is robust to misspecification of dynamics and error serial correlation. The theoretical results are illustrated with small sample evidence obtained by means of Monte Carlo simulations, which suggest that the performance of the CS-DL approach is often superior to the alternative panel ARDL estimates, particularly when T is not too large, in the range of 30≤T.
    JEL: C23
    Date: 2015–01–01
  3. By: Kyriacou, Maria ; Phillips, Peter C.B. ; Rossi, Francesca
    Abstract: Ordinary least squares (OLS) is well-known to produce an inconsistent estimator of the spatial parameter in pure spatial autoregression (SAR). This paper explores the potential of indirect inference to correct the inconsistency of OLS. Under broad conditions, it is shown that indirect inference (II) based on OLS produces consistent and asymptotically normal estimates in pure SAR regression. The II estimator is robust to departures from normal disturbances and is computationally straightforward compared with pseudo Gaussian maximum likelihood (PML). Monte Carlo experiments based on various specifications of the weighting matrix confirm that the indirect inference estimator displays little bias even in very small samples and gives overall performance that is comparable to the Gaussian PML.
    Keywords: bias, binding function, inconsistency, indirect inference, spatial autoregression
    Date: 2014–09–22
  4. By: Russell Davidson (Department of Economics and CIREQ, McGill University ); Andrea Monticini (Dipartimento di Economia e Finanza, Università Cattolica del Sacro Cuore )
    Abstract: In many, if not most, econometric applications, it is impossible to estimate consistently the elements of the white-noise process or processes that underlie the DGP. A common example is a regression model with heteroskedastic and/or autocorrelated disturbances, where the heteroskedasticity and autocorrelation are of unknown form. A particular version of the wild bootstrap can be shown to work very well with many models, both univariate and multivariate, in the presence of heteroskedasticity. Nothing comparable appears to exist for handling serial correlation. Recently, the dependent wild bootstrap has been proposed. Here, we extend this new method and link it to the well-known HAC covariance estimator, in much the same way as the wild bootstrap can be linked to the HCCME. It works very well even with sample sizes smaller than 50, and merits considerable further study.
    Keywords: Bootstrap, time series, wild bootstrap, dependent wild bootstrap, HAC covariance matrix estimator
    JEL: C12 C22 C32
    Date: 2014–03
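The heteroskedasticity-robust wild bootstrap that this abstract takes as its starting point can be sketched in a few lines. The following is a generic illustration with Rademacher weights and OLS refits, not the authors' HAC-linked dependent wild bootstrap; the function name, weight choice, and simulated design are all assumptions made for the example.

```python
import numpy as np

def wild_bootstrap_se(X, y, n_boot=499, seed=0):
    """Wild-bootstrap standard errors for OLS coefficients (Rademacher weights)."""
    rng = np.random.default_rng(seed)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    boot = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        v = rng.choice([-1.0, 1.0], size=len(y))  # flip residual signs at random
        y_star = X @ beta + resid * v             # resample under the fitted model
        boot[b] = np.linalg.lstsq(X, y_star, rcond=None)[0]
    return beta, boot.std(axis=0, ddof=1)
```

Because each bootstrap draw rescales the observed residual at each observation, the resampled errors inherit the (unknown) heteroskedasticity pattern of the data; handling serial correlation requires the dependent-weight extension the paper studies.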
  5. By: Arthur Charpentier (Université du Québec à Montréal, CREM & GERAD ); Emmanuel Flachaire (Aix-Marseille University (Aix-Marseille School of Economics), CNRS & EHESS, Institut Universitaire de France )
    Abstract: Standard kernel density estimation methods are very often used in practice to estimate density functions. They work well in numerous cases but are known to perform poorly with skewed, multimodal and heavy-tailed distributions. Such features are common in income distributions, which are defined over the positive support. In this paper, we show that a preliminary logarithmic transformation of the data, combined with standard kernel density estimation methods, can provide a much better fit of the density.
    Keywords: nonparametric density estimation, heavy-tail, income distribution, data transformation, lognormal kernel
    JEL: C15
    Date: 2015–02
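The transformation idea is simple enough to sketch: estimate a standard Gaussian KDE on the logged data, then map the estimate back to the original scale with the Jacobian of the transform. This is a minimal illustration, not the authors' exact estimator; the rule-of-thumb bandwidth and the function name are assumptions.

```python
import numpy as np

def log_transform_kde(x, grid, bw=None):
    """Gaussian KDE of log(x), mapped back to the original (positive) scale."""
    lx = np.log(np.asarray(x, dtype=float))
    n = len(lx)
    if bw is None:
        bw = 1.06 * lx.std(ddof=1) * n ** (-0.2)  # rule-of-thumb bandwidth
    u = (np.log(grid)[:, None] - lx[None, :]) / bw
    f_log = np.exp(-0.5 * u ** 2).sum(axis=1) / (n * bw * np.sqrt(2.0 * np.pi))
    return f_log / grid  # Jacobian: f_X(x) = f_{log X}(log x) / x
```

Working in log space spreads out the compressed lower tail and compresses the heavy upper tail, which is why a fixed-bandwidth kernel performs better there than on the raw income data.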
  6. By: Ahmad Farid Osman ; Maxwell L. King
    Abstract: There is evidence that exponential smoothing methods as well as time varying parameter models perform relatively well in forecasting comparisons. The aim of this paper is to introduce a new forecasting technique by integrating the exponential smoothing model with regressors whose coefficients are time varying. In doing this, we construct an exponential smoothing model with regressors by extending Holt's linear exponential smoothing model. We then translate it into an equivalent state space structure so that the parameters can be estimated via the maximum likelihood estimation procedure. Due to the potential problem in the updating equation for the regressor coefficients when the change in the regressor is too small, we propose an alternative structure of the state space model which allows the updating process to be put on hold until sufficient information is available. An empirical study of forecast accuracy shows that the new model performs better than the existing exponential smoothing model as well as the linear regression model.
    Keywords: State space model, Single source of error, Time varying parameter, Time series, Forecast accuracy
    JEL: C51 C53
    Date: 2015
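For reference, the base model being extended, Holt's linear exponential smoothing, can be sketched as the textbook level/trend recursion below. This is not the authors' regressor-augmented state space model; the initialisation and parameter names are illustrative.

```python
import numpy as np

def holt_linear(y, alpha=0.5, beta=0.5, h=1):
    """h-step-ahead forecast from Holt's linear exponential smoothing."""
    y = np.asarray(y, dtype=float)
    level, trend = y[0], y[1] - y[0]  # simple initialisation from the first two points
    for t in range(1, len(y)):
        prev_level = level
        # update the level toward the new observation, then the trend toward the level change
        level = alpha * y[t] + (1.0 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1.0 - beta) * trend
    return level + h * trend
```

The paper's extension adds regressors with their own time-varying coefficients to this recursion and estimates all smoothing parameters jointly in a single-source-of-error state space form.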
  7. By: Shonosuke Sugasawa (Graduate School of Economics, The University of Tokyo ); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo )
    Abstract: The Box-Cox transformation is applied to linear mixed models for analyzing positive and clustered data. The problem is that the maximum likelihood estimator of the transformation parameter is not consistent. To fix it, we suggest a simple and consistent estimator for the transformation parameter based on the moment method. The consistent estimator is used to construct consistent estimators of the parameters involved in the model and to provide an empirical predictor of a linear combination of both fixed and random effects. Second-order accurate prediction intervals for measuring uncertainty of the predictor are derived. Finally, the performance of the proposed procedure is investigated through simulation and empirical studies.
    Date: 2015–02
  8. By: Omay, Tolga ; Hasanov, Mubariz ; Emirmahmutoglu, Furkan
    Abstract: In this study, we propose a new unit root test procedure that allows for both gradual structural break and asymmetric nonlinear adjustment towards the equilibrium level. Small-sample properties of the new test are examined through Monte-Carlo simulations. The simulation results suggest that the new test has satisfactory size and power properties. We then apply this new test along with other unit root tests to examine stationarity properties of real exchange rate series of the sample countries. Our test rejects the null of unit root in more cases when compared to alternative tests. Overall, we find that the PPP proposition holds in the majority of the European countries examined in this paper.
    Keywords: Smooth Structural Break; Nonlinear Unit Root test; PPP
    JEL: C12 C22 F41
    Date: 2014–09–03
  9. By: Jia Chen ; Degui Li ; Oliver Linton ; Zudi Lu
    Abstract: Dynamic portfolio choice has been a central and essential objective for institutional investors in active asset management. In this paper, we study dynamic portfolio choice with multiple conditioning variables, where the number of conditioning variables can be either fixed or diverging to infinity at a certain polynomial rate of the sample size. We propose a novel data-driven method to estimate the optimal portfolio choice, motivated by the model averaging marginal regression approach suggested by Li, Linton and Lu (2015). More specifically, in order to avoid the curse of dimensionality associated with the multivariate nonparametric regression problem and to make it practically implementable, we first estimate the marginal optimal portfolio choice by maximising the conditional utility function for each univariate conditioning variable, and then construct the joint dynamic optimal portfolio through the weighted average of the marginal optimal portfolios across all the conditioning variables. Under some regularity conditions, we establish the large sample properties of the developed portfolio choice procedure. Both simulation studies and an empirical application demonstrate the performance of the proposed methodology.
    Keywords: Conditioning variables, kernel smoothing, model averaging, portfolio choice, utility function
    JEL: C13 C14 C32
    Date: 2015–02
  10. By: Biørn, Erik (Dept. of Economics, University of Oslo )
    Abstract: The measurement error problem in linear time series regression is considered, with focus on the impact of error memory, modeled as finite-order MA processes. Three prototype models, two bivariate and one univariate ARMA, and ways of handling the problem by using instrumental variables (IVs) are discussed as examples. One example has a static bivariate regression equation, with dynamics entering via the memory of its latent variables. The examples illustrate how 'structural dynamics' interacting with measurement error memory creates bias in Ordinary Least Squares (OLS) and illustrate the potential of IV estimation procedures. Supplementary Monte Carlo simulations are provided for two of the example models.
    Keywords: Errors in variables; ARMA; Error memory; Simultaneity bias; Attenuation; Monte Carlo
    JEL: C22 C26 C32 C36 C52 C53
    Date: 2014–12–30
  11. By: Seojeong Lee (School of Economics, Australian School of Business, the University of New South Wales )
    Abstract: Under treatment effect heterogeneity, an instrument identifies the instrument-specific local average treatment effect (LATE). If a regression model is estimated by two-stage least squares (2SLS) using multiple instruments, then 2SLS is consistent for a weighted average of different LATEs. In practice, a rejection of the overidentifying restrictions test can indicate that there is more than one LATE. What is often overlooked in the literature is that the postulated moment condition evaluated at the 2SLS estimand does not hold unless those LATEs are the same. If so, the conventional heteroskedasticity-robust variance estimator would be inconsistent. However, 2SLS standard errors based on the conventional variance estimator have been reported even when the overidentifying restrictions test is rejected. I propose a consistent estimator for the asymptotic variance of 2SLS by using the result of Hall and Inoue (2003) on misspecified moment condition models. This can be used to correctly calculate the standard errors regardless of whether there is more than one LATE.
    Keywords: local average treatment effect, treatment heterogeneity, two-stage least squares, variance estimator, model misspecification
    JEL: C13 C31 C36
    Date: 2015–01
  12. By: David M. Kaplan (Department of Economics, University of Missouri-Columbia ); Matt Goldman
    Abstract: Using and extending fractional order statistic theory, we characterize the O(n^(-1)) coverage probability error of the confidence intervals for population quantiles previously proposed in Hutson (1999), which use L-statistics as endpoints. We derive an analytic expression for the n^(-1) term, which may be used to calibrate the nominal coverage level to get O(n^(-3/2) log(n)) coverage error. Asymptotic power is shown to be optimal. Using kernel smoothing, we propose a related method for nonparametric inference on conditional quantiles. This new method compares favorably with asymptotic normality and bootstrap methods in theory and in simulations. Code is provided for both unconditional and conditional inference.
    Keywords: fractional order statistics, high-order accuracy, inference-optimal bandwidth, kernel smoothing.
    JEL: C21
    Date: 2015–02–09
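As background, the classical distribution-free confidence interval for a quantile, which the fractional order statistic approach refines by interpolating between order statistics, picks two order statistics as endpoints using the binomial distribution of the number of observations below the quantile. A minimal sketch follows; the index convention is one common choice for moderate n, not the paper's calibrated interval, and the function name is invented.

```python
import math
import numpy as np

def quantile_ci(sample, p, alpha=0.05):
    """Distribution-free CI for the p-th quantile from order statistics (moderate n)."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    # Binomial(n, p) pmf/cdf of the count of observations below the true quantile
    pmf = np.array([math.comb(n, k) * p ** k * (1.0 - p) ** (n - k)
                    for k in range(n + 1)])
    cdf = np.cumsum(pmf)
    lo = int(np.searchsorted(cdf, alpha / 2))                 # lower endpoint index
    hi = min(int(np.searchsorted(cdf, 1 - alpha / 2)) + 1, n - 1)  # upper endpoint index
    return x[lo], x[hi]
```

Because the endpoint indices jump discretely, the attained coverage over- or under-shoots the nominal level by O(n^(-1)); interpolating fractionally between adjacent order statistics, as in Hutson (1999), is what removes most of that discreteness error.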
  13. By: Florian Ziel
    Abstract: Due to the increasing impact of big data, shrinkage algorithms are of great importance in almost every area of statistics, as well as in time series analysis. In current literature the focus is on lasso type algorithms for autoregressive time series models with homoscedastic residuals. In this paper we present an iteratively reweighted adaptive lasso algorithm for the estimation of time series models under conditional heteroscedasticity in a high-dimensional setting. We analyse the asymptotic behaviour of the resulting estimator that turns out to be significantly better than its homoscedastic counterpart. Moreover, we discuss a special case of this algorithm that is suitable to estimate multivariate AR-ARCH type models in a very fast fashion. Several model extensions like periodic AR-ARCH or ARMA-GARCH are discussed. Finally, we show different simulation results and an application to electricity price and load data.
    Date: 2015–02
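The core iteration, refit the lasso with penalty weights proportional to 1/|beta|^gamma from the previous step, can be sketched for a plain AR(p) design. This is a generic homoscedastic illustration via coordinate descent, not the authors' conditionally heteroscedastic estimator; the function names and tuning constants are invented for the example.

```python
import numpy as np

def soft_threshold(z, t):
    # lasso soft-thresholding operator
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_lasso_ar(y, p, lam, gamma=1.0, n_iter=5, cd_steps=50):
    """Iteratively reweighted adaptive lasso for an AR(p) model (illustrative)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    # lagged design matrix: column j holds lag j+1 of y
    X = np.column_stack([y[p - j - 1:T - j - 1] for j in range(p)])
    z = y[p:]
    beta = np.linalg.lstsq(X, z, rcond=None)[0]   # OLS start for the weights
    col_ss = np.einsum('ij,ij->j', X, X)          # per-column sums of squares
    for _ in range(n_iter):
        w = 1.0 / (np.abs(beta) ** gamma + 1e-8)  # adaptive reweighting step
        b = beta.copy()
        for _ in range(cd_steps):                 # coordinate descent cycles
            for j in range(p):
                r = z - X @ b + X[:, j] * b[j]    # partial residual excluding lag j
                b[j] = soft_threshold(X[:, j] @ r, lam * w[j]) / col_ss[j]
        beta = b
    return beta
```

Small coefficients get large weights and are driven exactly to zero, while large coefficients are penalised only lightly; the paper's contribution is to reweight additionally by estimated conditional variances, which this sketch omits.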
  14. By: Francesca Monti (Bank of England ; Centre for Macroeconomics (CFM) )
    Abstract: This paper proposes a method for detecting the sources of misspecification in a DSGE model based on testing, in a data-rich environment, the exogeneity of the variables of the DSGE with respect to some auxiliary variables. Finding evidence of non-exogeneity implies misspecification, but finding that some specific variables help predict certain shocks can shed light on the dimensions along which the model is misspecified. Forecast error variance decomposition analysis then helps assess the relevance of the missing channels. The paper puts the proposed methodology to work both in a controlled experiment - by running Monte Carlo simulations with a known DGP - and using a state-of-the-art model and US data up to 2011.
    Keywords: DSGE Models, Model Misspecification, Bayesian Analysis
    JEL: C32 C52
    Date: 2015–01
  15. By: Süß, Philipp
    Abstract: The following note proposes a simple procedure to estimate k parameters of interest in a linear model with potentially k conditionally endogenous variables of interest and m endogenous control variables in the presence of at least one instrumental variable under the assumption of conditional mean independence.
    Keywords: Instrumental variables; Conditional independence assumption; Underidentified model
    JEL: C26
    Date: 2015–02–10
  16. By: Omay, Tolga ; Yildirim, Dilem
    Abstract: We develop unit root tests that allow under the alternative hypothesis for a smooth transition between deterministic linear trends, around which stationary asymmetric adjustment may occur, by employing exponential smooth transition auto-regressive (ESTAR) models. The small-sample properties of the newly developed test are briefly investigated, and an application investigating the PPP hypothesis for Argentina is provided.
    Keywords: Smooth Break; Nonlinear Unit Root Test; PPP
    JEL: C12 C22 F4
    Date: 2013–05–10
  17. By: Anindya Banerjee ; Massimiliano Marcellino ; Igor Masten
    Abstract: The Factor-augmented Error Correction Model (FECM) generalizes the factor-augmented VAR (FAVAR) and the Error Correction Model (ECM), combining error-correction, cointegration and dynamic factor models. It uses a larger set of variables compared to the ECM and incorporates the long-run information lacking from the FAVAR because of the latter's specification in differences. In this paper we review the specification and estimation of the FECM, and illustrate its use for forecasting and structural analysis by means of empirical applications based on Euro Area and US data.
    Keywords: Dynamic Factor Models, Cointegration, Structural Analysis, Factor-augmented Error Correction Models, FAVAR
    JEL: C32 E17
    Date: 2015–01
  18. By: Damien Challet
    Abstract: Counting the number of local extrema of the cumulative sum of data points yields the R-test, a new single-sample non-parametric test. Numerical simulations indicate that the R-test is more powerful than Student's t-test for semi-heavy and heavy-tailed distributions, equivalent for Gaussian variables and slightly worse for uniformly distributed variables. Finally, the R-test has greater power than the single-sample Wilcoxon signed-rank test in most cases.
    Date: 2015–02
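The statistic itself is easy to sketch: count the direction changes of the cumulative-sum path. The counting convention below (strict sign changes of consecutive increments) is an assumption for illustration, not necessarily the paper's exact definition, and no critical values are reproduced here.

```python
import numpy as np

def count_local_extrema(s):
    """Count local extrema of a sequence: points where consecutive increments change sign."""
    d = np.diff(np.asarray(s, dtype=float))
    return int(np.sum(np.sign(d[1:]) != np.sign(d[:-1])))

def r_statistic(x):
    """R-test statistic sketch: number of local extrema of the cumulative sum."""
    return count_local_extrema(np.cumsum(np.asarray(x, dtype=float)))
```

The intuition: under a zero-mean null, the cumulative sum changes direction often, while a location shift makes the path trend and reduces the number of extrema, so an unusually low count signals a nonzero mean.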
  19. By: Stefano Lamperti
    Abstract: Simulated models suffer intrinsically from validation and comparison problems. The choice of a suitable indicator quantifying the distance between the model and the data is pivotal to model selection. However, how to validate and discriminate between alternative models is still an open problem calling for further investigation, especially in light of the increasing use of simulations in social sciences. In this paper, we present an information theoretic criterion to measure how close models' synthetic output replicates the properties of observable time series without the need to resort to any likelihood function or to impose stationarity requirements. The indicator is sufficiently general to be applied to any kind of model able to simulate or predict time series data, from simple univariate models such as Auto Regressive Moving Average (ARMA) and Markov processes to more complex objects including agent-based or dynamic stochastic general equilibrium models. More specifically, we use a simple function of the L-divergence computed at different block lengths in order to select the model that is better able to reproduce the distributions of time changes in the data. To evaluate the L-divergence, probabilities are estimated across frequencies including a correction for the systematic bias. Finally, using a known data generating process, we show how this indicator can be used to validate and discriminate between different models providing a precise measure of the distance between each of them and the data.
    Keywords: Simulations, Empirical Validation, Time Series, Agent Based Models
    Date: 2015–02–24
  20. By: Stéphane Mussard ; Fattouma Souissi-Benrejab
    Abstract: Data contamination and excessive correlations between regressors (multicollinearity) constitute a standard and major problem in econometrics. Two techniques enable solving these problems, in separate ways: the Gini regression for the former, and the PLS (partial least squares) regression for the latter. Gini-PLS regressions are proposed in order to treat extreme values and multicollinearity simultaneously.
    Date: 2015–02
  21. By: Task Force Members Include: Lilli Japec ; Frauke Kreuter ; Marcus Berg ; Paul Biemer ; Paul Decker ; Cliff Lampe ; Julia Lane ; Cathy O'Neil ; Abe Usher
    Abstract: In recent years we have seen an increase in the amount of statistics in society describing different phenomena based on so-called Big Data. The term Big Data is used for a variety of data, as explained in the report, many of them characterized not just by their large volume, but also by their variety and velocity, the organic way in which they are created, and the new types of processes needed to analyze them and make inference from them. The changes in the nature of these new types of data, in their availability, and in the ways they are collected and disseminated are fundamental, and constitute a paradigm shift for survey research.
    Keywords: AAPOR, Big Data
    JEL: C
    Date: 2015–02–12
  22. By: Junior Maih
    Abstract: In an environment where economic structures break, variances change, distributions shift, conventional policies weaken and past events tend to reoccur, economic agents have to form expectations over different regimes. This makes the regime-switching dynamic stochastic general equilibrium (RS-DSGE) model the natural framework for analyzing the dynamics of macroeconomic variables. We present efficient solution methods for solving this class of models, allowing for the transition probabilities to be endogenous and for agents to react to anticipated events. The solution algorithms derived use a perturbation strategy which, unlike what has been proposed in the literature, does not rely on the partitioning of the switching parameters. These algorithms are all implemented in RISE, a flexible object-oriented toolbox that can easily integrate alternative solution methods. We show that our algorithms replicate various examples found in the literature. Among those is a switching RBC model for which we present a third-order perturbation solution.
    Keywords: DSGE, Markov switching, Sylvester equation, Newton algorithm, perturbation, matrix polynomial
    Date: 2014–12
  23. By: Alexander Schnurr
    Abstract: We introduce two types of ordinal pattern dependence between time series. Positive (resp. negative) ordinal pattern dependence can be seen as a non-parametric and in particular non-linear counterpart to positive (resp. negative) correlation. We show in an explorative study that both types of this dependence show up in real-world financial data.
    Date: 2015–01
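The building block, the ordinal pattern (rank ordering) of a short moving window, is straightforward to compute. Below is a hedged sketch of a coincidence-based dependence measure: the excess probability that two series show the same length-m pattern over what independence of the pattern sequences would imply. The paper's exact definitions may differ; all names here are illustrative.

```python
import numpy as np
from collections import Counter

def ordinal_patterns(x, m=3):
    """Rank pattern of each sliding window of length m."""
    x = np.asarray(x, dtype=float)
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def pattern_coincidence(x, y, m=3):
    """Excess probability that x and y show the same ordinal pattern at the same time."""
    px, py = ordinal_patterns(x, m), ordinal_patterns(y, m)
    n = len(px)
    same = sum(a == b for a, b in zip(px, py)) / n
    fx, fy = Counter(px), Counter(py)
    # coincidence probability expected if the two pattern sequences were independent
    expected = sum((fx[k] / n) * (fy[k] / n) for k in fx)
    return same - expected
```

A clearly positive value indicates the two series tend to move through the same up/down shapes simultaneously, a non-linear analogue of positive correlation; comparing patterns of one series with reversed patterns of the other would give the negative counterpart.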

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.