nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒02‒22
24 papers chosen by
Sune Karlsson
Örebro universitet

  1. Single-Step Estimation of a Partially Linear Model By Daniel J. Henderson; Christopher F. Parmeter
  2.  l1 Regressions: Gini Estimators for Fixed Effects Panel Data By Ndene Ka; Stephane Mussard
  3. "Linear Shrinkage Estimation of Large Covariance Matrices with Use of Factor Models" By Yuki Ikeda; Tatsuya Kubokawa
  4. Accuracy and efficiency of various GMM inference techniques in dynamic micro panel data models By Jan F. Kiviet; Milan Pleus; Rutger Poldermans
  5. Bivariate GARCH models for single asset returns By Tomasz Skoczylas
  6. Testing for Panel Cointegration using Common Correlated Effects Estimators By Anindya Banerjee; Josep Lluis Carrion-i-Silvestre
  7. Monitoring Stationarity and Cointegration By Wagner, Martin; Wied, Dominik
  8. Nonparametric change-point analysis of volatility By Markus Bibinger; Moritz Jirak; Mathias Vetter;
  9. Dynamic Vector Mode Regression By Gordon C R Kemp; Paulo M D C Parente; J M C Santos Silva
  10. Factor based identification-robust inference in IV regressions By Kapetanios, George; Khalaf, Lynda; Marcellino, Massimiliano
  11. Testing Uniformity on High-Dimensional Spheres against Contiguous Rotationally Symmetric Alternatives By Christine Cutting; Davy Paindaveine; Thomas Verdebout
  12. Shrinkage Estimation of Dynamic Panel Data Models with Interactive Fixed Effects By Xun Lu; Su Liangjun
  13. Demand Estimation with Machine Learning and Model Combination By Patrick Bajari; Denis Nekipelov; Stephen P. Ryan; Miaoyu Yang
  14. A New Class of Bivariate Threshold Cointegration Models By Biqing Cai; Jiti Gao; Dag Tjøstheim
  15. On Flexible Linear Factor Stochastic Volatility Models By Malefaki, Valia
  16. Local Directional Moran Scatter Plot - LDMS. By Davide Fiaschi; Lisa Gianmoena; Angela Parenti
  17. Nonparametric inference on conditional quantile treatment effects using L-statistics By David M. Kaplan; Matt Goldman
  18. Quasi-Newton particle Metropolis-Hastings applied to intractable likelihood models By Johan Dahlin; Fredrik Lindsten; Thomas B. Schön
  19. Modeling corporate defaults: Poisson autoregressions with exogenous covariates (PARX) By Arianna Agosto; Giuseppe Cavaliere; Dennis Kristensen; Anders Rahbek
  20. Non Parametric Estimates of Option Prices Using Superhedging By Gianluca Cassese
  21. MIDAS regressions with time-varying parameters: An application to corporate bond spreads and GDP in the Euro area By Schumacher, Christian
  22. Identification of the Timing-of-Events Model with Multiple Competing Exit Risks from Single-Spell Data By Drepper, Bettina; Effraimidis, Georgios
  23. Inferring the predictability induced by a persistent regressor in a predictive threshold model By Gonzalo, Jesus; Pitarakis, Jean-Yves
  24. Statistical Methods for Distributional Analysis By Franck A. Cowell; Emmanuel Flachaire

  1. By: Daniel J. Henderson (Department of Economics, Finance and Legal Studies, University of Alabama); Christopher F. Parmeter (Department of Economics, University of Miami)
    Abstract: In this paper we propose an asymptotically equivalent single-step alternative to the two-step partially linear model estimator in Robinson (1988). The estimator not only has the potential to decrease computing time dramatically, but also shows substantial finite-sample gains in Monte Carlo simulations.
    Keywords: Cross-validation, bandwidth, bias, Monte Carlo, Kernel
    Publication Status: Under Review
    JEL: C01 C14 C21
    Date: 2015–01–01
  2. By: Ndene Ka (LAMETA, Université Montpellier I); Stephane Mussard (LAMETA, Université Montpellier I; GREDI, Université de Sherbrooke; CEPS Luxembourg)
    Abstract: Panel data, frequently employed in empirical investigations, yield estimators that are strongly biased in the presence of atypical observations. The aim of this work is to propose an l1 Gini regression for panel data. It is shown that the fixed-effects within-group Gini estimator is more robust than its OLS counterpart when the data are contaminated by outliers. This semi-parametric Gini estimator is proven to be a U-statistic and is consequently asymptotically normal.
    Keywords: Gini, Panel, Regression, U-statistics.
    Date: 2015–02
  3. By: Yuki Ikeda (Graduate School of Economics, The University of Tokyo); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo)
    Abstract: The problem of estimating large covariance matrices with use of factor models is addressed when both the sample size and the dimension of the covariance matrix tend to infinity. In this paper, we consider a general class of weighted estimators which includes (i) linear combinations of the sample covariance matrix and the model-based estimator under the factor model, and (ii) ridge-type estimators without factors as special cases. The optimal weights in the class are derived, and plug-in weighted estimators are suggested since the optimal weights depend on unknown parameters. Numerical results show that our methods perform well. Finally, an application to portfolio management is given.
    Date: 2015–02
  4. By: Jan F. Kiviet (Division of Economics, School of Humanities and Social Sciences, Nanyang Technological University, 14 Nanyang Drive, Singapore 637332;); Milan Pleus (Amsterdam School of Economics & Tinbergen Institute, University of Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam, The Netherlands); Rutger Poldermans (Amsterdam School of Economics & Tinbergen Institute, University of Amsterdam, Valckenierstraat 65, 1018 XE Amsterdam, The Netherlands)
    Abstract: We examine the finite-sample performance of inference obtained by variants of the Arellano-Bond and Blundell-Bond GMM estimation techniques for single dynamic panel data models with possibly endogenous regressors and cross-sectional heteroskedasticity. By simulation, we examine the effects of using particular instrument-strength-enhancing reductions and transformations of the matrix of instrumental variables, of less robust implementations of the GMM weighting matrix, and of corrections to the standard asymptotic variance estimates. We compare the root mean squared errors of the coefficient estimators, the size of tests on coefficient values, and different implementations of overidentification restriction tests. The size and power of tests on the validity of the additional orthogonality conditions exploited by the Blundell-Bond technique are also assessed over a fairly wide grid of relevant cases. Surprisingly, particular asymptotically optimal and relatively robust weighting matrices are found to be superior in finite samples to ostensibly more appropriate versions. Most variants of tests for overidentification restrictions show serious deficiencies. A recently developed modification of GMM is found to have great potential when the cross-sectional heteroskedasticity is pronounced and the time-series dimension of the sample is not too small. Finally, all techniques are applied to actual data and lead to some profound insights.
    Keywords: cross-sectional heteroskedasticity, Sargan-Hansen (incremental) tests, variants of t-tests, weighting matrices, Windmeijer-correction
    JEL: C12 C13 C15 C23 C26 C52
    Date: 2014–12
  5. By: Tomasz Skoczylas (Faculty of Economic Sciences, University of Warsaw)
    Abstract: In this paper an alternative approach to modelling and forecasting single-asset return volatility is presented. A new, bivariate, flexible framework, which may be considered a development of single-equation ARCH-type models, is proposed. This approach focuses on the joint distribution of returns and observed volatility, measured by the Garman-Klass variance estimator, and it enables the examination of simultaneous dependencies between them. The proposed models are compared with benchmark GARCH and range-based GARCH (RGARCH) models in terms of prediction accuracy. All models are estimated by maximum likelihood, using time series of EUR/PLN spot rate quotations and the WIG20 index. The results are very encouraging, especially for forecasting Value-at-Risk. The bivariate models achieved lower rates of VaR exceptions as well as lower coverage test statistics, without being more conservative than their single-equation counterparts, as their forecast error measures are rather similar.
    Keywords: bivariate volatility models, joint distribution, range-based volatility estimators, Garman-Klass estimator, observed volatility, volatility modelling, GARCH, leverage, Value-at-Risk, volatility forecasting
    JEL: C13 C32 C53 C58 G10 G17
    Date: 2015
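The observed-volatility input used in the abstract above, the Garman-Klass estimator, is a standard textbook formula and can be sketched in a few lines (the function name and sample prices are illustrative, not the paper's code):

```python
import numpy as np

def garman_klass_variance(o, h, l, c):
    """Garman-Klass daily variance estimate from open, high, low, close:
    0.5 * ln(H/L)^2 - (2 ln 2 - 1) * ln(C/O)^2."""
    o, h, l, c = (np.asarray(v, dtype=float) for v in (o, h, l, c))
    hl = np.log(h / l)   # intraday range term
    co = np.log(c / o)   # open-to-close drift term
    return 0.5 * hl**2 - (2.0 * np.log(2.0) - 1.0) * co**2
```

Unlike a squared close-to-close return, the estimator uses the full daily range, which is why it serves as a per-day "observed volatility" series.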
  6. By: Anindya Banerjee; Josep Lluis Carrion-i-Silvestre
    Abstract: This paper analyzes spurious regression in panel data when the time series are cross-sectionally dependent. We show that consistent estimation of the long-run average parameter is possible once we control for cross-section dependence using cross-section averages, in the spirit of the common correlated effects approach of Pesaran (2006). This result is used to design a panel cointegration test statistic that accounts for cross-section dependence. The performance of the proposal is investigated in comparison with factor-based methods to control for cross-section dependence when strong, semi-weak and weak cross-section dependence may be present.
    Keywords: panel cointegration, cross-section dependence, common factors, spatial econometrics
    JEL: C12 C22
    Date: 2014–12
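The cross-section-averages idea behind the CCE approach can be sketched for a single-regressor panel: augmenting the pooled regression with the averages of y and x proxies the unobserved common factors. This is a minimal illustration under an assumed one-factor design, not the authors' implementation:

```python
import numpy as np

def cce_pooled(Y, X):
    """Pooled CCE sketch for a single-regressor panel: Y, X are (N, T).
    Cross-section averages of y and x proxy the unobserved common
    factors, so the pooled regression is augmented with them."""
    N, T = Y.shape
    ybar, xbar = Y.mean(axis=0), X.mean(axis=0)   # cross-section averages
    Z = np.column_stack([
        X.ravel(),            # regressor of interest
        np.tile(ybar, N),     # average of y, repeated for each unit
        np.tile(xbar, N),     # average of x, repeated for each unit
        np.ones(N * T),       # constant
    ])
    coef, *_ = np.linalg.lstsq(Z, Y.ravel(), rcond=None)
    return coef[0]            # slope on x

# Simulated panel with one common factor f and heterogeneous loadings.
rng = np.random.default_rng(0)
N, T = 50, 100
f = rng.normal(size=T)
lam = 1.0 + 0.5 * rng.normal(size=(N, 1))   # loadings in y
chi = 1.0 + 0.5 * rng.normal(size=(N, 1))   # loadings in x
X = rng.normal(size=(N, T)) + chi * f
Y = 2.0 * X + lam * f + 0.2 * rng.normal(size=(N, T))
beta_hat = cce_pooled(Y, X)
```

Because the averages (common across units) are partialled out of every unit's data, this pooled regression is algebraically the CCEP-style projection; naive pooled OLS on this design would be badly biased by the factor.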
  7. By: Wagner, Martin; Wied, Dominik
    Abstract: We propose a monitoring procedure to detect a structural change from stationary to integrated behavior. When the procedure is applied to the errors of a relationship between integrated series, it thus monitors a structural change from a cointegrating relationship to a spurious regression. The cointegration monitoring procedure is based on residuals from modified least squares estimation, using either Fully Modified, Dynamic or Integrated Modified OLS. The procedure is inspired by Chu et al. (1996) in that it is based on parameter estimation only on a pre-break "calibration" period rather than on sequential estimation over the full sample. We investigate the asymptotic behavior of the procedures under the null, for (fixed and local) alternatives, and in the case of parameter changes. We also study the finite sample performance via simulations. An application to credit default swap spreads illustrates the potential usefulness of the procedure.
    JEL: C32 C22 C52
    Date: 2014
  8. By: Markus Bibinger; Moritz Jirak; Mathias Vetter;
    Abstract: This work develops change-point methods for statistics of high-frequency data. The main interest is the volatility of an Itô semi-martingale, which is discretely observed over a fixed time horizon. We construct a minimax-optimal test to discriminate different smoothness classes of the underlying stochastic volatility process. In a high-frequency framework we prove weak convergence of the test statistic under the hypothesis to an extreme value distribution. As a key example, under extremely mild smoothness assumptions on the stochastic volatility we thereby derive a consistent test for volatility jumps. A simulation study demonstrates the practical value in finite-sample applications.
    Keywords: high-frequency data, nonparametric change-point test, minimax-optimal test, stochastic volatility, volatility jumps
    JEL: C12 C14
    Date: 2015–02
  9. By: Gordon C R Kemp; Paulo M D C Parente; J M C Santos Silva
    Abstract: We study the semi-parametric estimation of the conditional mode of a random vector that has a continuous conditional joint density with a well-defined global mode. A novel full-system estimator is proposed and its asymptotic properties are studied allowing for possibly dependent data. We specifically consider the estimation of vector autoregressive conditional mode models and of structural systems of linear simultaneous equations defined by mode restrictions. The proposed estimator is easy to implement using standard software and the results of a small simulation study suggest that it is well behaved in finite samples.
    Date: 2015–02–09
  10. By: Kapetanios, George; Khalaf, Lynda; Marcellino, Massimiliano
    Abstract: Robust methods for IV inference have received considerable attention recently. Their analysis has raised a variety of problematic issues, such as size/power trade-offs resulting from weak or many instruments. We show that information-reduction methods provide a useful and practical solution to this and related problems. Formally, we propose factor-based modifications to three popular weak-instrument-robust statistics, and illustrate their validity asymptotically and in finite samples. Results are derived using asymptotic settings that are commonly used in both the factor and weak instrument literatures. For the Anderson-Rubin statistic, we also provide analytical finite sample results that do not require any underlying factor structure. An illustrative Monte Carlo study reveals the following: factor-based tests control size regardless of instrument and factor quality, and all factor-based tests are systematically more powerful than their standard counterparts. With informative instruments, and in contrast with standard tests: (i) the power of factor-based tests is not affected by k even when k is large, and (ii) a weak factor structure does not cost power. An empirical study of a New Keynesian macroeconomic model suggests that our factor-based methods can bridge a number of gaps between structural and statistical modeling.
    Keywords: factor model; identification-robust inference; IV regression; new Keynesian model; principal components; weak instruments
    Date: 2015–02
  11. By: Christine Cutting; Davy Paindaveine; Thomas Verdebout
    Keywords: contiguity; high-dimensional statistics; local and asymptotic normality; rotationally symmetric distributions; spherical statistics; tests of uniformity
    Date: 2014–02
  12. By: Xun Lu (HKUST); Su Liangjun (Singapore Management University)
    Abstract: We consider the problem of determining the number of factors and selecting the proper regressors in linear dynamic panel data models with interactive fixed effects. Based on the preliminary estimates of the slope parameters and factors à la Bai and Ng (2009) and Moon and Weidner (2014a), we propose a method for simultaneous selection of regressors and factors and estimation through the method of adaptive group Lasso (least absolute shrinkage and selection operator). We show that with probability approaching one, our method can correctly select all relevant regressors and factors and shrink the coefficients of irrelevant regressors and redundant factors to zero. Further, we demonstrate that our shrinkage estimators of the nonzero slope parameters exhibit the oracle property. We conduct Monte Carlo simulations to demonstrate the superb finite-sample performance of the proposed method. We apply our method to study the determinants of economic growth and find that in addition to three common unobserved factors selected by our method, government consumption share has negative effects, whereas investment share and lagged economic growth have positive effects on economic growth.
    Keywords: Adaptive Lasso; Dynamic panel; Factor selection; Group Lasso; Interactive fixed effects; Oracle property; Selection consistency
    JEL: C13 C23 C51
    Date: 2015–02
  13. By: Patrick Bajari; Denis Nekipelov; Stephen P. Ryan; Miaoyu Yang
    Abstract: We survey and apply several techniques from the statistical and computer science literature to the problem of demand estimation. We derive novel asymptotic properties for several of these models. To improve out-of-sample prediction accuracy and obtain parametric rates of convergence, we propose a method of combining the underlying models via linear regression. Our method has several appealing features: it is robust to a large number of potentially collinear regressors; it scales easily to very large data sets; the machine learning methods combine model selection and estimation; and the method can flexibly approximate arbitrary non-linear functions, even when the set of regressors is high dimensional and we also allow for fixed effects. We illustrate our method using a standard scanner panel data set to estimate promotional lift and find that our estimates are considerably more accurate in out-of-sample predictions of demand than some commonly used alternatives. While demand estimation is our motivating application, these methods are likely to be useful in other microeconometric problems.
    JEL: C14 C53
    Date: 2015–02
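The model-combination step described in the abstract above can be illustrated with a toy stacking exercise. The candidate models and the simulated data here are purely illustrative stand-ins, not the paper's learners:

```python
import numpy as np

# Simulated demand-style data: outcome depends non-linearly on x.
rng = np.random.default_rng(0)
n = 400
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + 0.5 * x**2 + rng.normal(scale=0.3, size=n)

# Split: fit candidate models on one half, fit combination weights on the other.
x_tr, y_tr, x_va, y_va = x[:200], y[:200], x[200:], y[200:]

def fit_poly(deg):
    """Candidate model: polynomial least squares of a given degree."""
    coef = np.polyfit(x_tr, y_tr, deg)
    return lambda z: np.polyval(coef, z)

models = [fit_poly(1), fit_poly(2)]

# Combination step: regress held-out outcomes on candidate predictions.
P = np.column_stack([m(x_va) for m in models])
w, *_ = np.linalg.lstsq(P, y_va, rcond=None)

def combined(z):
    """Linear combination of the candidate models' predictions."""
    return np.column_stack([m(z) for m in models]) @ w
```

Because the weights are chosen by least squares on the validation sample, the combined predictor's validation error can be no worse than that of any single candidate (each candidate is one admissible weight vector).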
  14. By: Biqing Cai; Jiti Gao; Dag Tjøstheim
    Abstract (fragment): … T^(1/2), while the convergence rate for the estimators of the coefficients in the middle regime is T. We also show that the convergence rate of the cointegrating coefficient is T^(1/2), the same as in the linear cointegration model. The Monte Carlo simulation results suggest that the estimators perform reasonably well in finite samples. Applying the proposed model to study the dynamic relationship between the Federal funds rate and the 3-month Treasury bill rate, we find that the cointegrating coefficients are the same in the two regimes while the short-run loading coefficients differ.
    Keywords: β-null recurrent, cointegration, Markov chain, threshold VAR models
    JEL: C11 C58 G01
    Date: 2015
  15. By: Malefaki, Valia
    Abstract: In this thesis I discuss a flexible Bayesian treatment of the linear factor stochastic volatility model with latent factors, which proves essential in order to preserve parsimony when the number of cross-sections in the data grows. Based on the Bayesian model selection literature, I introduce a flexible prior specification which allows carrying out restriction search on the mean equation coefficients of the factor model, the loadings matrix. I use this restriction search as a data-based alternative to evaluate the cross-sectional restrictions suggested by arbitrage pricing theory. A mixture innovation model is also proposed which generalizes the standard stochastic volatility specification and can also be interpreted as a restriction search on the variance equation parameters. I comment on how to use the mixture innovation model to capture both gradual and abrupt changes in the stochastic evolution of the covariance matrix of high-dimensional financial datasets. This approach has the additional advantages of dating when large jumps in volatility have occurred in the data and determining whether these jumps are attributable to any of the factors, the innovation errors, or combinations of those.
    Keywords: Factor model; Bayesian prior
    JEL: C01 C11 G11 G12
    Date: 2015–01
  16. By: Davide Fiaschi; Lisa Gianmoena; Angela Parenti
    Abstract: This paper proposes a novel methodology to estimate the distribution dynamics of income in the presence of spatial dependence by representing spatial dynamics as a random vector field in Moran space. Inference on the local spatial dynamics is discussed, including a test for the presence of local spatial dependence. The methodology also allows computing a forecast of the future income distribution that incorporates the effects of spatial dependence. An application to US states illustrates the capabilities of the methodology.
    Keywords: Exploratory data analysis, polarization, random vector field, spatial dynamics, spatial dependence, distribution dynamics, US States.
    JEL: C14 O51 R11
    Date: 2015–02–01
  17. By: David M. Kaplan (Department of Economics, University of Missouri-Columbia); Matt Goldman
    Abstract: We provide novel methods for inference on quantile treatment effects in both unconditional and conditional (nonparametric) settings. These methods achieve high-order accuracy by using the probability integral transform and a Dirichlet (rather than Gaussian) reference distribution. We propose related methods for joint inference on multiple quantiles and inference on linear combinations of quantiles, again in both unconditional and conditional settings. Optimal bandwidth and coverage probability rates are derived for all methods, and code is provided.
    Keywords: Dirichlet distribution, fractional order statistics, high-order accuracy.
    JEL: C21
    Date: 2014–09–29
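The paper's high-order methods build on fractional order statistics; the classical integer-order analogue, a distribution-free confidence interval for a quantile from the binomial distribution of the order statistics, can be sketched as follows (the index conventions here are illustrative and only approximately calibrated, not the authors' construction):

```python
import numpy as np
from scipy.stats import binom

def quantile_ci(x, p, level=0.95):
    """Approximate distribution-free CI for the p-quantile: the count of
    observations below the true quantile is Binomial(n, p), so binomial
    quantiles give order-statistic indices bracketing it."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    alpha = 1.0 - level
    lo = int(binom.ppf(alpha / 2, n, p))
    hi = int(binom.ppf(1 - alpha / 2, n, p))
    lo = max(lo - 1, 0)      # conservative widening at the lower edge
    hi = min(hi, n - 1)
    return x[lo], x[hi]
```

Interpolating between adjacent order statistics (the "fractional" refinement the paper develops with a Dirichlet reference distribution) tightens the coverage error of this integer-index construction.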
  18. By: Johan Dahlin; Fredrik Lindsten; Thomas B. Schön
    Abstract: Particle Metropolis-Hastings enables Bayesian parameter inference in general nonlinear state space models (SSMs). However, in many implementations a random walk proposal is used, and this can result in poor mixing if not tuned correctly using tedious pilot runs. Therefore, we consider a new proposal inspired by quasi-Newton algorithms that achieves better mixing with less tuning. Compared to other Hessian-based proposals, it only requires estimates of the gradient of the log-posterior. A possible application of this new proposal is parameter inference in the challenging class of SSMs with intractable likelihoods. We exemplify this application and the benefits of the new proposal by modelling log-returns of futures contracts on coffee with a stochastic volatility model with symmetric α-stable observations.
    Date: 2015–02
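The building block behind any particle Metropolis-Hastings sampler is a particle-filter estimate of the likelihood, which is then plugged into an ordinary MH acceptance ratio. A minimal bootstrap filter for a linear Gaussian state-space model (a tractable stand-in for the intractable models in the paper, with illustrative parameter values) might look like:

```python
import numpy as np

def pf_loglik(y, phi, sigma_v, sigma_e, n_part=500, seed=0):
    """Bootstrap particle filter estimate of log p(y_{1:T}) for
        x_t = phi * x_{t-1} + sigma_v * v_t,   y_t = x_t + sigma_e * e_t,
    with v_t, e_t standard normal."""
    rng = np.random.default_rng(seed)
    # Initialise particles from the stationary state distribution.
    x = rng.normal(scale=sigma_v / np.sqrt(1 - phi**2), size=n_part)
    ll = 0.0
    for yt in y:
        x = phi * x + sigma_v * rng.normal(size=n_part)       # propagate
        logw = (-0.5 * ((yt - x) / sigma_e) ** 2
                - np.log(sigma_e * np.sqrt(2.0 * np.pi)))     # weight
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())       # log p(y_t | y_{1:t-1}) estimate
        x = rng.choice(x, size=n_part, p=w / w.sum())         # resample
    return ll

# Simulate a short series from the model and evaluate the estimator.
rng = np.random.default_rng(1)
T, phi, sv, se = 100, 0.9, 0.3, 0.5
xs = np.zeros(T)
xs[0] = rng.normal(scale=sv / np.sqrt(1 - phi**2))
for t in range(1, T):
    xs[t] = phi * xs[t - 1] + sv * rng.normal()
y = xs + se * rng.normal(size=T)
ll = pf_loglik(y, phi, sv, se)
```

In a pseudo-marginal MH loop, `pf_loglik` would be re-evaluated at each proposed parameter; the paper's contribution is a gradient-informed (quasi-Newton) proposal for that loop rather than the random walk sketched here.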
  19. By: Arianna Agosto (Gruppo Bancario Credito Valtellinese); Giuseppe Cavaliere (Department of Statistical Sciences, University of Bologna); Dennis Kristensen (Department of Economics, University College London, Institute of Fiscal Studies, and CREATES); Anders Rahbek (University of Copenhagen and CREATES)
    Abstract: We develop a class of Poisson autoregressive models with additional covariates (PARX) that can be used to model and forecast time series of counts. We establish the time series properties of the models, including conditions for stationarity and existence of moments. These results are in turn used in the analysis of the asymptotic properties of the maximum-likelihood estimators of the models. The PARX class of models is used to analyse the time series properties of monthly corporate defaults in the US in the period 1982-2011, using financial and economic variables as exogenous covariates. Results show that our model is able to capture the time series dynamics of corporate defaults well, including the well-known default count clustering found in the data. Moreover, we find that while in general current defaults do indeed affect the probability of other firms defaulting in the future, in recent years economic and financial factors at the macro level are capable of explaining a large portion of the correlation of US firms' defaults over time.
    Keywords: corporate defaults, count data, exogenous covariates, Poisson autoregression, estimation
    JEL: C13 C22 C25 G33
    Date: 2015–01–28
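The PARX recursion is straightforward to simulate. The sketch below uses one exogenous covariate and illustrative parameter values; the paper's exact specification (lag orders, covariate transformation) may differ:

```python
import numpy as np

def simulate_parx(T, omega=0.5, alpha=0.3, beta=0.4, gamma=0.8, seed=0):
    """Simulate a PARX(1,1)-style process: counts y_t ~ Poisson(lam_t) with
        lam_t = omega + alpha * y_{t-1} + beta * lam_{t-1} + gamma * x_{t-1},
    where x_t is a positive exogenous covariate (here |N(0,1)|).
    alpha + beta < 1 keeps the count process stable."""
    rng = np.random.default_rng(seed)
    x = np.abs(rng.normal(size=T))
    y = np.zeros(T, dtype=int)
    lam = np.zeros(T)
    lam[0] = omega / (1.0 - alpha - beta)   # rough unconditional level
    y[0] = rng.poisson(lam[0])
    for t in range(1, T):
        lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1] + gamma * x[t - 1]
        y[t] = rng.poisson(lam[t])
    return y, lam, x

y, lam, x = simulate_parx(500)
```

The feedback through lagged counts and lagged intensity is what produces the default-count clustering the abstract refers to, while the x term lets macro covariates shift the intensity.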
  20. By: Gianluca Cassese
    Abstract: We propose a new non parametric technique to estimate the CALL function based on the superhedging principle. Our approach does not require absence of arbitrage and easily accommodates bid/ask spreads and other market imperfections. We prove some optimal statistical properties of our estimates. As an application we first test the methodology on a simulated sample of option prices and then on the S\&P 500 index options.
    Date: 2015–02
  21. By: Schumacher, Christian
    Abstract: Mixed-data sampling (MIDAS) regressions make it possible to estimate dynamic equations that explain a low-frequency variable by high-frequency variables and their lags. To account for temporal instabilities in this relationship, this paper discusses an extension to MIDAS with time-varying parameters, which follow random-walk processes. The non-linear functional forms in the MIDAS regression necessitate the use of non-linear filtering techniques. In this paper, the particle filter is used to estimate the time-varying parameters in the model. Simulations with time-varying DGPs help to assess the properties of the estimation approach. A real-time application to the relationship between daily corporate bond spreads and quarterly GDP growth in the Euro area shows that the leading indicator property of the spreads ahead of GDP has diminished during the recent crisis. During that period, corporate bond spreads rather seem to be coincident indicators of GDP growth.
    JEL: C51 C53 E37
    Date: 2014
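The non-linear functional form the abstract mentions is the parametric lag polynomial that maps many high-frequency lags into one low-frequency regressor. A sketch of the standard exponential Almon weighting (parameter values illustrative):

```python
import numpy as np

def exp_almon_weights(K, theta1, theta2):
    """Exponential Almon lag weights, a standard MIDAS parameterisation:
    w_k proportional to exp(theta1*k + theta2*k^2), normalised to sum to one."""
    k = np.arange(1, K + 1)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

def midas_aggregate(x_hf, theta1, theta2):
    """Collapse the K most recent high-frequency observations
    (most recent first) into a single low-frequency regressor."""
    x_hf = np.asarray(x_hf, dtype=float)
    return exp_almon_weights(len(x_hf), theta1, theta2) @ x_hf
```

Because the weights enter the regression non-linearly through (theta1, theta2), letting these and the slope drift as random walks yields the non-linear state-space form that the paper estimates with a particle filter.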
  22. By: Drepper, Bettina (Tilburg University); Effraimidis, Georgios (University of Southern Denmark)
    Abstract: This note describes how the (single-spell) identification result of the timing-of-events model by Abbring and Van den Berg (2003b) can be extended to a model that accommodates several competing exit risks. The extended model can be used for example to distinguish between the different effects of a benefit sanction on several competing exit risks out of unemployment such as 'finding work' vs. 'exiting the labor force'. By allowing for a flexible dependence structure between competing exit risks and the duration until entry into treatment, the model can take account of selection into treatment and dependencies between competing exit risks by way of unobservables.
    Keywords: competing risks, treatment effects, multivariate duration analysis, mixed proportional hazard, timing-of-events
    JEL: C41 C31 J64
    Date: 2015–02
  23. By: Gonzalo, Jesus; Pitarakis, Jean-Yves
    Abstract: We develop tests for detecting possibly episodic predictability induced by a persistent predictor. Our framework is that of a predictive regression model with threshold effects, and our goal is to develop operational and easily implementable inferences when one does not wish to impose a priori restrictions on the parameters of the model other than the slopes corresponding to the persistent predictor. Differently put, our tests for the null hypothesis of no predictability against threshold predictability remain valid without the need to know whether the remaining parameters of the model are characterised by threshold effects or not (e.g. shifting versus non-shifting intercepts). One interesting feature of our setting is that our test statistics remain unaffected by whether some nuisance parameters are identified or not. We subsequently apply our methodology to the predictability of aggregate stock returns with valuation ratios and document a robust countercyclicality in the ability of some valuation ratios to predict returns, in addition to highlighting a strong sensitivity of predictability-based results to the time period under consideration.
    Keywords: predictive regressions, threshold effects, predictability of stock returns
    Date: 2015–01–01
  24. By: Franck A. Cowell (STICERD London School of Economics; Université du Québec à Montréal CREM & GERAD); Emmanuel Flachaire (Aix-Marseille University (Aix-Marseille School of Economics), CNRS & EHESS Institut Universitaire de France)
    Abstract: This chapter is about the techniques, formal and informal, that are commonly used to give quantitative answers in the field of distributional analysis, covering subjects including inequality, poverty and the modelling of income distributions. It deals with parametric and non-parametric approaches and the way in which imperfections in data may be handled in practice.
    Keywords: goodness of fit, parametric modelling, non-parametric methods, dominance criteria, welfare indices, inequality measure, poverty measure, influence function, hypothesis testing, confidence intervals, bootstrap
    JEL: D31 D63 C10
    Date: 2015–02
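As one concrete example of the inequality measures such a chapter covers, the Gini coefficient can be computed directly from the textbook mean-difference formula (this is the standard definition, not the chapter's own notation or estimator):

```python
import numpy as np

def gini(x):
    """Gini coefficient via the mean-difference formula:
    G = sum_ij |x_i - x_j| / (2 * n^2 * mean(x)), for non-negative incomes."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    diff_sum = np.abs(x[:, None] - x[None, :]).sum()   # all pairwise gaps
    return diff_sum / (2.0 * n**2 * x.mean())
```

The O(n^2) pairwise formula is fine for illustration; for perfectly equal incomes it returns 0, and for a sample where one unit holds everything it approaches 1 as n grows.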

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.