nep-ecm New Economics Papers
on Econometrics
Issue of 2013‒03‒30
eleven papers chosen by
Sune Karlsson
Örebro University

  1. Pivotal uniform inference in high-dimensional regression with random design in wide classes of models via linear programming By Eric Gautier; Alexandre Tsybakov
  2. Simple, Asymptotically Distribution-Free, Optimal Tests for Circular Reflective Symmetry about a Known Median Direction By Christophe Ley; Thomas Verdebout
  3. Estimating Alternative Technology Sets in Nonparametric Efficiency Analysis: Restriction Tests for Panel and Clustered Data By Anne Neumann; Maria Nieswand; Torben Schubert
  4. Ten Things You Should Know About DCC By Massimiliano Caporin; Michael McAleer
  5. Volatility Inference in the Presence of Both Endogenous Time and Microstructure Noise By Yingying Li; Zhiyuan Zhang; Xinghua Zheng
  6. Bias in the Mean Reversion Estimator in Continuous-Time Gaussian and Lévy Processes By Yong Bao; Aman Ullah; Yun Wang; Jun Yu
  7. Asymptotically UMP Panel Unit Root Tests By Becheri, I.G.; Drost, F.C.; Akker, R. van den
  8. Iteration Capping For Discrete Choice Models Using the EM Algorithm By Kabatek, J.
  9. Factor Models in High-Dimensional Time Series: A Time-Domain Approach By Marc Hallin; Marco Lippi
  10. Prediction Bias Correction for Dynamic Term Structure Models By Eran Raviv
  11. A Noncausal Autoregressive Model with Time-Varying Parameters: An Application to U.S. Inflation By Markku Lanne; Jani Luoto

  1. By: Eric Gautier (CREST, ENSAE); Alexandre Tsybakov (CREST, ENSAE)
    Abstract: We propose a new method of estimation in the high-dimensional linear regression model. It allows for very weak distributional assumptions, including heteroscedasticity, and does not require knowledge of the variance of the random errors. The method is based on linear programming only, so its numerical implementation is faster than for previously known techniques using conic programs, and it allows one to deal with higher-dimensional models. We provide upper bounds for the estimation and prediction errors of the proposed estimator, showing that it achieves the same rate as in the more restrictive situation of fixed design and i.i.d. Gaussian errors with known variance. Following Gautier and Tsybakov (2011), we obtain the results under weaker sensitivity assumptions than the restricted eigenvalue or assimilated conditions.
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1303.7092&r=ecm
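The abstract's key computational point is that an l1-type regression estimator can be solved by linear programming alone. As an illustration of that reduction (a sketch of the closely related Dantzig selector, not the paper's own estimator; the function name and tuning constant are mine), the sup-norm constraint and l1 objective become a standard LP after splitting the coefficient vector into positive and negative parts:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    """Solve  min ||b||_1  s.t.  ||X'(y - Xb)||_inf <= lam  as an LP.

    Writing b = b_plus - b_minus (both >= 0) makes the l1 objective linear;
    the sup-norm constraint becomes 2p linear inequalities.
    Illustrative sketch only, not the estimator proposed in the paper.
    """
    n, p = X.shape
    G, Xy = X.T @ X, X.T @ y
    c = np.ones(2 * p)                       # ||b||_1 = sum(b+) + sum(b-)
    A_ub = np.vstack([np.hstack([G, -G]),    #  G b <= lam + X'y
                      np.hstack([-G, G])])   # -G b <= lam - X'y
    b_ub = np.concatenate([lam + Xy, lam - Xy])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * p), method="highs")
    return res.x[:p] - res.x[p:]
```

On a sparse toy design the LP recovers the large coefficients and keeps the rest near zero, using only a linear-programming solver rather than the conic solvers that l2-constrained competitors require.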
  2. By: Christophe Ley; Thomas Verdebout
    Keywords: circular statistics; fisher information singularity; Rayleigh test for uniformity; skewed distributions; tests for symmetry
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/142644&r=ecm
  3. By: Anne Neumann; Maria Nieswand; Torben Schubert
    Abstract: Nonparametric efficiency analysis has become a widely applied technique to support industrial benchmarking as well as a variety of incentive-based regulation policies. In practice such exercises are often plagued by incomplete knowledge about the correct specifications of inputs and outputs. Simar and Wilson (2001) and Schubert and Simar (2011) propose restriction tests to support such specification decisions for cross-section data. However, the typically oligopolized market structure pertinent to regulation contexts often leads to low numbers of cross-section observations, rendering reliable estimation based on these tests practically infeasible. This small-sample problem could often be avoided with the use of panel data, which would in any case require an extension of the cross-section restriction tests to handle panel data. In this paper we derive these tests. We prove the consistency of the proposed method and apply it to a sample of US natural gas transmission companies from 2003 through 2007. We find that the total quantity of gas delivered and gas delivered in peak periods measure essentially the same output, so only one needs to be included. We also show that the length of mains as a measure of transportation service is non-redundant and therefore must be included.
    Keywords: Benchmarking models, network industries, nonparametric efficiency estimation, data envelopment analysis, testing restrictions, subsampling, Bootstrap
    JEL: C14 L51 L95
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1283&r=ecm
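The restriction tests above build on nonparametric efficiency scores of the data-envelopment-analysis (DEA) type. As background for readers new to the area, here is the textbook input-oriented CCR (constant-returns) DEA score computed by linear programming; this is standard DEA, not the paper's panel restriction test, and the function name is mine:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, j0):
    """Input-oriented CRS (CCR) DEA efficiency score of unit j0.

    X: inputs (n_units x n_inputs), Y: outputs (n_units x n_outputs).
    Solves: min theta  s.t.  X'lam <= theta * x_j0,  Y'lam >= y_j0,  lam >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]               # variables: [theta, lam_1..lam_n]
    A_ub = np.vstack([
        np.hstack([-X[j0][:, None], X.T]),    # X'lam - theta * x_j0 <= 0
        np.hstack([np.zeros((s, 1)), -Y.T]),  # -Y'lam <= -y_j0
    ])
    b_ub = np.r_[np.zeros(m), -Y[j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun
```

A score of 1 marks a frontier unit; a score of 0.5 means the unit could in principle produce its outputs with half its inputs. The specification tests in the paper ask whether dropping or merging input/output columns changes such scores significantly.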
  4. By: Massimiliano Caporin; Michael McAleer (University of Canterbury)
    Abstract: The purpose of the paper is to discuss ten things potential users should know about the limits of the Dynamic Conditional Correlation (DCC) representation for estimating and forecasting time-varying conditional correlations. The reasons given for caution about the use of DCC include the following: DCC represents the dynamic conditional covariances of the standardized residuals, and hence does not yield dynamic conditional correlations; DCC is stated rather than derived; DCC has no moments; DCC does not have testable regularity conditions; DCC yields inconsistent two step estimators; DCC has no asymptotic properties; DCC is not a special case of GARCC, which has testable regularity conditions and standard asymptotic properties; DCC is not dynamic empirically as the effect of news is typically extremely small; DCC cannot be distinguished empirically from diagonal BEKK in small systems; and DCC may be a useful filter or a diagnostic check, but it is not a model.
    Keywords: DCC; BEKK; GARCC; Stated representation; Derived model; Conditional covariances; Conditional correlations; Regularity conditions; Moments; Two step estimators; Assumed properties; Asymptotic properties; Filter; Diagnostic check
    JEL: C18 C32 C58 G17
    Date: 2013–03–19
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:13/16&r=ecm
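The paper's first caution, that DCC models the conditional covariance of the standardized residuals and produces "correlations" only by rescaling, can be seen directly in Engle's (2002) recursion. A minimal filter sketch (function name and default parameters are mine):

```python
import numpy as np

def dcc_correlations(eps, a=0.05, b=0.90):
    """Engle's (2002) DCC recursion on standardized residuals eps (T x k):

        Q_t = (1 - a - b) * S + a * e_{t-1} e_{t-1}' + b * Q_{t-1}
        R_t = diag(Q_t)^{-1/2} Q_t diag(Q_t)^{-1/2}

    The dynamics live in Q_t, a covariance of the standardized residuals;
    R_t is obtained only by normalizing Q_t -- the point the paper stresses.
    """
    T, k = eps.shape
    S = np.cov(eps, rowvar=False)   # unconditional covariance target
    Q = S.copy()
    R = np.empty((T, k, k))
    for t in range(T):
        if t > 0:
            e = eps[t - 1][:, None]
            Q = (1 - a - b) * S + a * (e @ e.T) + b * Q
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)   # rescale covariance into a correlation
    return R
```

With a + b < 1 and S positive semidefinite, each Q_t stays positive semidefinite, so each R_t is a valid correlation matrix; note nothing in the recursion is derived from an underlying model for the correlations themselves, which is the "stated rather than derived" critique.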
  5. By: Yingying Li; Zhiyuan Zhang; Xinghua Zheng
    Abstract: In this article we consider the volatility inference in the presence of both market microstructure noise and endogenous time. Estimators of the integrated volatility in such a setting are proposed, and their asymptotic properties are studied. Our proposed estimator is compared with the existing popular volatility estimators via numerical studies. The results show that our estimator can have substantially better performance when time endogeneity exists.
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1303.5809&r=ecm
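For readers unfamiliar with the problem the abstract addresses: at high frequency, raw realized variance is swamped by microstructure noise. One classical noise-robust benchmark (not the paper's own estimator, which additionally handles endogenous time) is the two-scale realized variance of Zhang, Mykland and Aït-Sahalia (2005); a sketch:

```python
import numpy as np

def realized_variance(p):
    """Sum of squared log-price increments."""
    return np.sum(np.diff(p) ** 2)

def tsrv(p, K=20):
    """Two-scale realized variance (Zhang, Mykland & Ait-Sahalia, 2005).

    Averages realized variance over K interleaved sparse grids, where noise
    matters less, then subtracts a bias estimate built from full-grid RV,
    which at the highest frequency is dominated by microstructure noise.
    """
    n = len(p) - 1
    rv_sparse = np.mean([realized_variance(p[k::K]) for k in range(K)])
    n_bar = (n - K + 1) / K          # average sparse-grid sample size
    return rv_sparse - (n_bar / n) * realized_variance(p)
```

On simulated data with i.i.d. noise, raw realized variance explodes with the sample size while the two-scale estimator stays near the integrated variance; the paper's numerical studies compare estimators of this kind when sampling times are also endogenous.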
  6. By: Yong Bao (Department of Economics, Purdue University); Aman Ullah (Department of Economics, University of California); Yun Wang (School of International Trade and Economics, University of International Business and Economics); Jun Yu (Sim Kee Boon Institute for Financial Economics, School of Economics and Lee Kong Chian School of Business, Singapore Management University)
    Abstract: This paper develops the approximate finite-sample bias of the ordinary least squares or quasi maximum likelihood estimator of the mean reversion parameter in continuous-time Lévy processes. For the special case of Gaussian processes, our results reduce to those of Tang and Chen (2009) (when the long-run mean is unknown) and Yu (2012) (when the long-run mean is known). Simulations show that in general the approximate bias works well in capturing the true bias of the mean reversion estimator under different scenarios. However, when the time span is small and the mean reversion parameter approaches its lower bound, we find it more difficult to approximate the finite-sample bias well.
    JEL: C10 C22
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:siu:wpaper:02-2013&r=ecm
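The bias the paper approximates analytically is easy to see by simulation in the Gaussian (Ornstein-Uhlenbeck) special case: with a short time span, the least squares estimate of the mean reversion parameter is strongly biased upward. A small Monte Carlo sketch (this demonstrates the phenomenon, not the paper's bias formula; function names are mine):

```python
import numpy as np

def simulate_ou(kappa, mu, sigma, h, n, rng):
    """Exact discretization of the OU process dX = kappa*(mu - X)dt + sigma*dW."""
    rho = np.exp(-kappa * h)
    sd = sigma * np.sqrt((1 - rho ** 2) / (2 * kappa))
    x = np.empty(n + 1)
    x[0] = mu
    for t in range(n):
        x[t + 1] = mu + rho * (x[t] - mu) + sd * rng.normal()
    return x

def ols_kappa(x, h):
    """OLS estimate of kappa: regress x_{t+1} on x_t (with intercept),
    then map the AR(1) slope back via kappa = -log(slope)/h."""
    Z = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    slope = np.linalg.lstsq(Z, x[1:], rcond=None)[0][1]
    return -np.log(slope) / h
```

With kappa = 1, monthly sampling and a ten-year span, the Monte Carlo mean of the estimator is far above 1, which is exactly the finite-sample bias the paper's approximation targets.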
  7. By: Becheri, I.G.; Drost, F.C.; Akker, R. van den (Tilburg University, Center for Economic Research)
    Abstract: This paper considers optimal unit root tests for a Gaussian cross-sectionally independent heterogeneous panel with incidental intercepts and heterogeneous alternatives generated by random perturbations. We derive the (asymptotic and local) power envelope for two models: an auxiliary model where both the panel units and the random perturbations are observed, and the second one, the model of main interest, for which only the panel units are observed. We show that both models are Locally Asymptotically Normal (LAN). It turns out that there is an information loss: the power envelope for the auxiliary model is above the envelope for the model of main interest. Equality only holds if the alternatives are homogeneous. Our results exactly identify in which setting the unit root test of Moon, Perron, and Phillips (2007) is asymptotically UMP and, in fact, they show it is not possible to exploit possible heterogeneity in the alternatives, confirming a conjecture of Breitung and Pesaran (2008). Moreover, we propose a new asymptotically optimal test and we extend the results to a model allowing for cross-sectional dependence.
    Keywords: panel unit root test;Local Asymptotic Normality;limit experiment;asymptotic power envelope;information loss
    JEL: C22 C23
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:2013017&r=ecm
  8. By: Kabatek, J. (Tilburg University, Center for Economic Research)
    Abstract: The Expectation-Maximization (EM) algorithm is a well-established estimation procedure used in many domains of econometric analysis. A recent application in a discrete choice framework (Train, 2008) facilitated estimation of latent class models allowing for very flexible treatment of unobserved heterogeneity. The high flexibility of these models is, however, counterweighted by often excessively long computation times, due to the iterative nature of the EM algorithm. This paper proposes a simple adjustment to the estimation procedure which achieves substantial gains in convergence speed without compromising any of the advantages of the original routine. The enhanced algorithm caps the number of iterations computed by the inner EM loop near its minimum, thereby avoiding optimization over suboptimally populated classes. Performance of the algorithm is assessed on a series of simulations, with the adjusted algorithm being 3-5 times faster than the original routine.
    Keywords: EM algorithm;discrete choice models;latent class models
    JEL: C14 C63
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:2013019&r=ecm
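To make the "inner loop" concrete: in a latent class logit estimated by EM, the M-step maximizes a weighted logit likelihood for each class, which is itself iterative, and capping those inner iterations yields a generalized EM step. A sketch in the spirit of Train (2008) for a binary latent-class logit on panel data (the function, its `max_inner` cap, and all defaults are my illustration, not the paper's exact routine):

```python
import numpy as np

def em_latent_class_logit(X, y, groups, n_classes, max_inner=2,
                          n_outer=150, seed=0):
    """EM for a latent-class binary logit: class membership is fixed within
    each individual (given by `groups`); `max_inner` caps the Newton
    iterations of the M-step -- the kind of capping the paper studies."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    ids = np.unique(groups)
    betas = rng.normal(0, 1.0, (n_classes, p))
    shares = np.full(n_classes, 1.0 / n_classes)
    for _ in range(n_outer):
        # E-step: per-individual posterior class probabilities
        logl = np.empty((n, n_classes))
        for c in range(n_classes):
            q = 1.0 / (1.0 + np.exp(-X @ betas[c]))
            logl[:, c] = np.where(y == 1, np.log(q), np.log(1 - q))
        grp = np.vstack([logl[groups == i].sum(axis=0) for i in ids])
        w_grp = shares * np.exp(grp - grp.max(axis=1, keepdims=True))
        w_grp /= w_grp.sum(axis=1, keepdims=True)
        shares = w_grp.mean(axis=0)
        w = w_grp[np.searchsorted(ids, groups)]   # back to observation level
        # M-step: capped Newton iterations on each class's weighted logit
        for c in range(n_classes):
            for _ in range(max_inner):
                q = 1.0 / (1.0 + np.exp(-X @ betas[c]))
                grad = X.T @ (w[:, c] * (y - q))
                H = (X * (w[:, c] * q * (1 - q))[:, None]).T @ X
                betas[c] += np.linalg.solve(H + 1e-8 * np.eye(p), grad)
    return shares, betas
```

Running the inner Newton loop to full convergence at every outer iteration wastes effort while the class weights are still far from their fixed point; capping it trades exact inner maximization for many cheaper outer passes, which is the source of the reported speed-up.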
  9. By: Marc Hallin; Marco Lippi
    Abstract: High-dimensional time series may well be the most common type of dataset in the so-called “big data” revolution, and have entered current practice in many areas, including meteorology, genomics, chemometrics, connectomics, complex physics simulations, biological and environmental research, finance and econometrics. The analysis of such datasets poses significant challenges, both from a statistical and from a numerical point of view. The most successful procedures so far have been based on dimension reduction techniques and, more particularly, on high-dimensional factor models. Those models have been developed, essentially, within time series econometrics, and deserve to be better known in other areas. In this paper, we provide an original time-domain presentation of the methodological foundations of those models (dynamic factor models usually are described via a spectral approach), contrasting such concepts as commonality and idiosyncrasy, factors and common shocks, dynamic and static principal components. That time-domain approach emphasizes the fact that, contrary to the static factor models favored by practitioners, the so-called general dynamic factor model essentially does not impose any constraints on the data-generating process, but follows from a general representation result.
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/142428&r=ecm
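For readers meeting factor models for the first time, the static principal-components estimator the abstract contrasts with the general dynamic model is only a few lines of linear algebra. A standard sketch (the paper's general dynamic factor model is richer than this static special case; the function name and normalization are the conventional ones):

```python
import numpy as np

def static_factors(X, r):
    """Principal-components estimate of r static factors from a T x N panel.

    Factors are the top r left singular vectors of the centered panel,
    normalized so that F'F / T = I_r; loadings follow by OLS of X on F.
    """
    T = X.shape[0]
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    F = U[:, :r] * np.sqrt(T)      # estimated factors
    L = (Xc.T @ F) / T             # estimated loadings
    return F, L
```

When the data are generated by a one-factor model plus idiosyncratic noise, the first principal component recovers the factor up to sign; the dynamic model discussed in the paper instead lets each common shock load on the series through whole filters rather than single coefficients.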
  10. By: Eran Raviv (Erasmus University Rotterdam)
    Abstract: When the yield curve is modelled using an affine factor model, the residuals may still contain relevant information and do not adhere to the familiar white-noise assumption. This paper proposes a pragmatic way to improve out-of-sample performance in yield curve forecasting. The proposed adjustment is illustrated via a pseudo out-of-sample forecasting exercise implementing the widely used Dynamic Nelson-Siegel model. Large improvements in forecasting performance are achieved throughout the curve for different forecasting horizons. Results are robust to different time periods, as well as to different model specifications.
    Keywords: Yield curve; Nelson Siegel; Time varying loadings; Factor models
    JEL: E43 E47 G17
    Date: 2013–03–07
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20130041&r=ecm
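The abstract's premise, that factor-model residuals are not white noise and so carry forecastable information, suggests a simple form of correction: forecast each maturity's residual with its own time-series model and add that to the factor-model forecast. A hypothetical sketch of this idea using an AR(1) per maturity (my illustration of the general principle, not necessarily the paper's exact adjustment):

```python
import numpy as np

def ar1_forecast(e):
    """One-step-ahead AR(1) forecast of a residual series (no intercept):
    phi is estimated by least squares, forecast is phi * last value."""
    phi = (e[:-1] @ e[1:]) / (e[:-1] @ e[:-1])
    return phi * e[-1]

def corrected_forecast(model_forecast, residuals):
    """Add an AR(1) forecast of each maturity's historical residuals
    (columns of `residuals`) to the factor-model yield forecast."""
    adj = np.array([ar1_forecast(residuals[:, j])
                    for j in range(residuals.shape[1])])
    return model_forecast + adj
```

If the residuals were genuinely white noise the adjustment would average out to zero; whenever they are persistent, the corrected forecast has lower mean squared error than the uncorrected one, which is the mechanism behind the reported gains.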
  11. By: Markku Lanne; Jani Luoto
    Abstract: We propose a noncausal autoregressive model with time-varying parameters and apply it to U.S. postwar inflation. The model fits the data well, and the results suggest that inflation persistence follows from future expectations. Persistence declined in the early 1980s and increased slightly again in the late 1990s. Estimates of the new Keynesian Phillips curve indicate that current inflation also depends on past inflation, although future expectations dominate. The implied trend inflation estimate evolves smoothly and is well aligned with survey expectations. There is evidence that the variation of trend inflation follows from the underlying marginal cost that drives inflation.
    JEL: C22 C51 C53 E31
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1285&r=ecm

This nep-ecm issue is ©2013 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.