nep-ecm New Economics Papers
on Econometrics
Issue of 2014‒06‒14
eleven papers chosen by
Sune Karlsson
Orebro University

  1. Low-dimensional decomposition, smoothing and forecasting of sparse functional data By Alexander Dokumentov; Rob J Hyndman
  2. On the Size Distortion from Linearly Interpolating Low-frequency Series for Cointegration Tests By Eric Ghysels; J. Isaac Miller
  3. On Forecast Evaluation By Wilmer Osvaldo Martínez-Rivera; Manuel Dario Hernández-Bejarano; Juan Manuel Julio-Román
  4. Inextricability of Autonomy and Confluence in Econometrics By Duo Qin
  5. A Modified Confidence Set for the Structural Break Date in Linear Regression Models By Yamamoto, Yohei
  6. Fast computation of reconciled forecasts for hierarchical and grouped time series By Rob J Hyndman; Alan Lee; Earo Wang
  7. Interpreting Financial Market Crashes as Earthquakes: A New Early Warning System for Medium-Term Crashes By Francine Gresnigt; Erik Kole; Philip Hans Franses
  8. Theory and practice of GVAR modeling By Chudik, Alexander; Pesaran, M. Hashem
  9. Stochastic Frontier Models for Long Panel Data Sets: Measurement of the Underlying Energy Efficiency for the OECD Countries By Massimo Filippini; Elisa Tosetti
  10. A multiple indicator model for panel data: an application to ICT area-level variation By Eva Ventura; Albert Satorra
  11. Entropy methods for identifying hedonic models By DUPUY Arnaud; GALICHON Alfred; HENRY Marc

  1. By: Alexander Dokumentov; Rob J Hyndman
    Abstract: We propose a new generic method ROPES (Regularized Optimization for Prediction and Estimation with Sparse data) for decomposing, smoothing and forecasting two-dimensional sparse data. In some ways, ROPES is similar to Ridge Regression, the LASSO, Principal Component Analysis (PCA) and Maximum-Margin Matrix Factorisation (MMMF). Using this new approach, we propose a practical method of forecasting mortality rates, as well as a new method for interpolating and extrapolating sparse longitudinal data. We also show how to calculate prediction intervals for the resulting estimates.
    Keywords: Tikhonov regularisation, Smoothing, Forecasting, Ridge regression, PCA, LASSO, Maximum-margin matrix factorisation, Mortality rates, Sparse longitudinal data
    JEL: C10 C14 C33
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2014-16&r=ecm
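    Sketch: The ROPES estimator itself is defined in the paper; as a loose, hypothetical illustration of the underlying idea (Tikhonov-regularised smoothing of a sparse two-dimensional surface, fitting only observed cells while penalising second differences in both directions), one might solve a stacked least-squares problem. The grid, noise level and penalty weights lam_r, lam_c are invented for this sketch.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical sparse "age x year" grid: smooth surface observed in ~30% of cells.
      m, n = 20, 15
      ii, jj = np.meshgrid(np.arange(m), np.arange(n), indexing="ij")
      truth = np.sin(ii / 4.0) + 0.05 * jj
      mask = rng.random((m, n)) < 0.3
      Y = truth + 0.1 * rng.standard_normal((m, n))

      def second_diff(k):
          # (k-2) x k second-difference operator
          return np.diff(np.eye(k), n=2, axis=0)

      # Vectorise the surface (row-major) and stack: data fit on observed cells
      # plus roughness penalties along each dimension (Tikhonov regularisation).
      S = np.eye(m * n)[mask.ravel()]              # selects observed cells
      P_rows = np.kron(second_diff(m), np.eye(n))  # smoothness across rows
      P_cols = np.kron(np.eye(m), second_diff(n))  # smoothness across columns
      lam_r, lam_c = 5.0, 5.0                      # invented penalty weights

      A = np.vstack([S, np.sqrt(lam_r) * P_rows, np.sqrt(lam_c) * P_cols])
      b = np.concatenate([Y[mask], np.zeros(P_rows.shape[0] + P_cols.shape[0])])
      F_hat = np.linalg.lstsq(A, b, rcond=None)[0].reshape(m, n)

      print("RMSE on unobserved cells:", np.sqrt(np.mean((F_hat - truth)[~mask] ** 2)))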
  2. By: Eric Ghysels; J. Isaac Miller (Department of Economics, University of Missouri-Columbia)
    Abstract: We analyze the sizes of standard cointegration tests applied to data subject to linear interpolation, discovering evidence of substantial size distortions induced by the interpolation. We propose modifications to these tests to effectively eliminate size distortion from such tests conducted on data interpolated from end-of-period sampled low-frequency series. Our results generally do not support linear interpolation when alternatives such as aggregation or mixed-frequency-modified tests are possible.
    Keywords: linear interpolation, cointegration, trace test, residual-based cointegration tests
    JEL: C12 C32
    Date: 2014–01–15
    URL: http://d.repec.org/n?u=RePEc:umc:wpaper:1403&r=ecm
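    Sketch: A hedged Monte Carlo along the lines the abstract suggests: simulate two independent random walks, linearly interpolate one of them from end-of-period low-frequency observations, and record how often a standard residual-based (Engle-Granger) test rejects at the nominal 5% level. The data-generating process and frequency ratio k are invented; this only loosely mimics the setting studied in the paper, not its actual tests.

      import numpy as np
      from statsmodels.tsa.stattools import coint

      rng = np.random.default_rng(1)

      def empirical_size(n_low=100, k=3, reps=200):
          # Under the null (independent walks) a correctly sized 5% test
          # should reject about 5% of the time; interpolation may distort this.
          rejections = 0
          for _ in range(reps):
              n_high = n_low * k
              x = np.cumsum(rng.standard_normal(n_high))  # high-frequency walk
              z = np.cumsum(rng.standard_normal(n_low))   # independent low-frequency walk
              t_low = np.arange(k - 1, n_high, k)         # end-of-period time stamps
              y = np.interp(np.arange(n_high), t_low, z)  # linear interpolation
              rejections += coint(y, x)[1] < 0.05
          return rejections / reps

      print("empirical size at nominal 5%:", empirical_size())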
  3. By: Wilmer Osvaldo Martínez-Rivera; Manuel Dario Hernández-Bejarano; Juan Manuel Julio-Román
    Abstract: We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, as is customary in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of alternative procedures. Although loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests, and are robust to the correlation, autocorrelation and heteroskedasticity settings those tests consider. In addition, since our proposals do not require samples of the same size, their scope is wider, and because they test the whole loss distribution rather than just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
    Keywords: Forecast evaluation, Stochastic order, Multiple comparison.
    JEL: C53 C12 C14
    Date: 2014–06–06
    URL: http://d.repec.org/n?u=RePEc:col:000094:011604&r=ecm
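    Sketch: The authors' tests are developed in the paper; as a loose stand-in for comparing whole loss distributions rather than mean losses, one might apply a one-sided Mann-Whitney rank test of whether one procedure's losses tend to be stochastically smaller than another's. Note that the two samples need not have equal sizes. The loss data are invented; this is not the authors' procedure.

      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(2)

      # Hypothetical losses from two forecast procedures; sample sizes differ.
      loss_a = np.abs(rng.normal(0.0, 1.0, size=120))  # procedure A
      loss_b = np.abs(rng.normal(0.3, 1.0, size=90))   # procedure B, shifted errors

      # One-sided rank test: are A's losses stochastically smaller than B's?
      stat, pval = mannwhitneyu(loss_a, loss_b, alternative="less")
      print(f"Mann-Whitney U = {stat:.1f}, p-value = {pval:.4f}")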
  4. By: Duo Qin (Department of Economics, SOAS, University of London, UK)
    Abstract: This paper examines how ‘confluence’ and ‘autonomy’, two key concepts introduced by Frisch around 1930, have disappeared from econometrics textbooks and why only fragments of the two have survived in mainstream econometrics. It relates the disappearance to a defect in the textbook position of unequivocally equating a priori theoretical models with correct structural models. It shows how the confluence-autonomy pair, taken together, reflects the complexity of econometricians’ goal of finding and verifying robust and economically meaningful models among numerous interdependent variables observable in the real and open world. This complexity makes it essential for applied research to have a set of model design rules combining both a priori substantive reasoning and a posteriori statistical testing. It also makes ensuring an adequately minimal model closure a top priority for applied modellers, a task far more important than the textbook task of parameter estimation.
    Keywords: exogeneity, structural invariance, omitted variable bias, multicollinearity, model selection and design
    JEL: B23 C13 C18 C50
    Date: 2014–06
    URL: http://d.repec.org/n?u=RePEc:soa:wpaper:189&r=ecm
  5. By: Yamamoto, Yohei
    Abstract: Elliott and Müller (2007) (EM) provide a method for constructing a confidence set for the structural break date by inverting a locally best test statistic. Previous studies show that the EM method produces a set with an accurate coverage ratio even for a small break; however, the set is often overly long. This study proposes a simple modification to rehabilitate their method. Following the literature, we provide an asymptotic justification for the modified method under a nonlocal asymptotic framework. A Monte Carlo simulation shows that, like the original method, the modified method exhibits a coverage ratio very close to the nominal level. More importantly, it achieves a much shorter confidence set. Hence, when the break is small, the modified method serves as a better alternative to Bai's (1997) confidence set. We apply these methods to a small level shift in post-1980s Japanese inflation data.
    Keywords: coverage ratio, nonlocal asymptotics, heteroskedasticity and autocorrelation consistent covariance, confidence set
    JEL: C12 C38
    Date: 2014–05–07
    URL: http://d.repec.org/n?u=RePEc:hit:econdp:2014-08&r=ecm
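    Sketch: The locally best statistic of Elliott and Müller and the proposed modification are in the paper; the generic test-inversion recipe behind any such confidence set (keep every candidate break date at which a level-alpha test does not reject) can be illustrated with a simple likelihood-ratio-type statistic in a mean-shift model. The cutoff value and data-generating process are invented; this is not the EM statistic.

      import numpy as np

      rng = np.random.default_rng(3)

      # Mean-shift model: y_t = mu + delta*1{t >= tau0} + e_t, true break at tau0.
      T, tau0, delta = 200, 120, 0.8
      y = rng.standard_normal(T) + delta * (np.arange(T) >= tau0)

      def ssr_at(tau):
          # sum of squared residuals with a mean shift imposed at date tau
          return np.sum((y[:tau] - y[:tau].mean()) ** 2) + \
                 np.sum((y[tau:] - y[tau:].mean()) ** 2)

      taus = np.arange(10, T - 10)            # trim the sample ends
      ssr = np.array([ssr_at(t) for t in taus])
      tau_hat = taus[ssr.argmin()]
      sigma2 = ssr.min() / (T - 2)

      # Invert the test: retain every date whose statistic is below the cutoff.
      lr = (ssr - ssr.min()) / sigma2
      crit = 7.7                              # invented cutoff, illustration only
      conf_set = taus[lr <= crit]
      print(f"estimate {tau_hat}, confidence set [{conf_set.min()}, {conf_set.max()}]")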
  6. By: Rob J Hyndman; Alan Lee; Earo Wang
    Abstract: We describe some fast algorithms for reconciling large collections of time series forecasts with aggregation constraints. The constraints arise due to the need for forecasts of collections of time series with hierarchical or grouped structures to add up in the same manner as the observed time series. We show that the least squares approach to reconciling hierarchical forecasts can be extended to more general non-hierarchical groups of time series, and that the computations can be handled efficiently by exploiting the structure of the associated design matrix. Our algorithms will reconcile hierarchical forecasts with hierarchies of unlimited size, making forecast reconciliation feasible in business applications involving very large numbers of time series.
    Keywords: combining forecasts, grouped time series, hierarchical time series, reconciling forecasts, weighted least squares.
    JEL: C32 C53 C63
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2014-17&r=ecm
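    Sketch: The least-squares reconciliation the abstract extends can be written as y-tilde = S(S'S)^{-1}S'y-hat, where S is the summing matrix mapping bottom-level series to all series; the paper's contribution is computing this efficiently for very large hierarchies by exploiting the structure of S, which is not shown here. A naive dense version for a toy two-level hierarchy (Total = A + B):

      import numpy as np

      # Summing matrix: rows are (Total, A, B), columns the bottom series (A, B).
      S = np.array([[1.0, 1.0],
                    [1.0, 0.0],
                    [0.0, 1.0]])

      # Incoherent base forecasts for (Total, A, B): 100 != 40 + 55.
      y_hat = np.array([100.0, 40.0, 55.0])

      # OLS reconciliation: project the base forecasts onto the coherent subspace.
      beta = np.linalg.solve(S.T @ S, S.T @ y_hat)  # bottom-level estimates
      y_tilde = S @ beta
      print(y_tilde)  # reconciled forecasts satisfy Total = A + B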
  7. By: Francine Gresnigt (Erasmus University Rotterdam); Erik Kole (Erasmus University Rotterdam); Philip Hans Franses (Erasmus University Rotterdam)
    Abstract: We propose a modeling framework for producing probability predictions of a future market crash in the medium term, such as sometime in the next five days. Our framework draws upon noticeable similarities between stock returns around a financial market crash and seismic activity around earthquakes. Our model is incorporated in an Early Warning System (EWS) for future crash days. Testing our EWS on S&P 500 data during the recent financial crisis, we find positive Hanssen-Kuiper Skill Scores. Furthermore, our modeling framework is capable of exploiting information in the returns series not captured by well-known and commonly used volatility models. EWSs based on our models outperform those based on the volatility models in forecasting extreme price movements, while producing forecasts in much less time.
    Keywords: Financial crashes; Hawkes process; self-exciting process; Early Warning System
    JEL: C13 C15 C53 G17
    Date: 2014–06–03
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20140067&r=ecm
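    Sketch: The paper's specification is its own; generically, a self-exciting (Hawkes) process has conditional intensity lambda(t) = mu + sum over past events of alpha*exp(-beta*(t - t_i)), and a medium-term warning can be based on the probability of at least one event in the next h days, 1 - exp(-integral of lambda). The parameter values and event times below are invented, and the calculation ignores further self-excitation from events that might arrive within the window.

      import numpy as np

      mu, alpha, beta = 0.02, 0.6, 1.2             # invented parameters (per day)
      event_times = np.array([80.0, 95.0, 96.5])   # hypothetical past crash days

      def expected_events(t, h):
          # integral of the intensity over (t, t+h], given events before t;
          # each exponential kernel integrates in closed form
          lags = t - event_times[event_times < t]
          excitation = (alpha / beta) * np.sum(np.exp(-beta * lags)) * (1 - np.exp(-beta * h))
          return mu * h + excitation

      t, h = 100.0, 5.0
      p_crash = 1.0 - np.exp(-expected_events(t, h))
      print(f"P(at least one crash in next {h:.0f} days) ~ {p_crash:.3f}")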
  8. By: Chudik, Alexander (Federal Reserve Bank of Dallas); Pesaran, M. Hashem (University of Southern California and Trinity College)
    Abstract: The Global Vector Autoregressive (GVAR) approach has proven very useful for analyzing interactions in the global macroeconomy and in other data networks where both the cross-section and the time dimensions are large. This paper surveys the latest developments in GVAR modeling, examining both the theoretical foundations of the approach and its numerous empirical applications. We provide a synthesis of the existing literature and highlight areas for future research.
    Keywords: Global Vector Autoregressive; global macroeconomy
    JEL: C32 E17
    Date: 2014–05–01
    URL: http://d.repec.org/n?u=RePEc:fip:feddgw:180&r=ecm
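    Sketch: A core GVAR building block is the construction of country-specific foreign ("star") variables as weighted cross-section averages, x*_it = sum_j w_ij x_jt, which then enter each country's small VARX* model alongside domestic lags. The weights and data below are invented; real applications typically use trade weights.

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical panel: T periods x N countries of some macro variable.
      T, N = 40, 4
      x = rng.standard_normal((T, N)).cumsum(axis=0)

      # Invented weight matrix W: row i weights country i's partners (w_ii = 0).
      W = rng.random((N, N))
      np.fill_diagonal(W, 0.0)
      W /= W.sum(axis=1, keepdims=True)

      # Foreign "star" variables: x*_it = sum_j w_ij * x_jt.
      x_star = x @ W.T

      # One illustrative country equation (VARX* with one lag of each variable).
      Y = x[1:, 0]
      X = np.column_stack([np.ones(T - 1), x[:-1, 0], x_star[1:, 0], x_star[:-1, 0]])
      coefs = np.linalg.lstsq(X, Y, rcond=None)[0]
      print("country-0 VARX* coefficients:", coefs.round(3))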
  9. By: Massimo Filippini (ETH Zurich, Switzerland); Elisa Tosetti (ETH Zurich, Switzerland)
    Abstract: In this paper we propose a general approach for estimating stochastic frontier models, suitable for long panel data sets. We measure efficiency as a linear combination of a finite number of unobservable common factors, with coefficients that vary across firms, plus a time-invariant component. We adopt recently developed econometric techniques for large, cross-sectionally correlated, non-stationary panel data models to estimate the frontier function. Given the long time span of the panel, we investigate whether the variables, including the unobservable common factors, are non-stationary, and, if so, whether they are cointegrated. To illustrate our approach empirically, we estimate a stochastic frontier model for energy demand and compute the level of the “underlying energy efficiency” for 24 OECD countries over the period 1980 to 2008. In our specification, we control for variables such as Gross Domestic Product, energy price, climate and technological progress, which are known to affect energy consumption. We also allow for heterogeneity across countries in the impact of these factors on energy demand. Our panel unit root tests suggest that energy demand and its key determinants are integrated and that they exhibit a long-run relation. The estimated efficiency scores point to European countries as the most efficient in consuming energy.
    Keywords: Energy demand; panels; common factors; principal components.
    JEL: C10 C31 C33
    Date: 2014–06
    URL: http://d.repec.org/n?u=RePEc:eth:wpswif:14-198&r=ecm
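    Sketch: The authors' estimator is developed in the paper; the generic step of extracting unobservable common factors from a large panel can be caricatured with the standard principal-components recipe (factors proportional to the leading left singular vectors, loadings by cross-section regression). The panel below is simulated, and the fitted common component plays the role of the factor-driven efficiency term only illustratively.

      import numpy as np

      rng = np.random.default_rng(5)

      # Simulated panel: T years x N countries, r common factors plus noise.
      T, N, r = 30, 24, 2
      factors = rng.standard_normal((T, r)).cumsum(axis=0)
      loadings = rng.standard_normal((N, r))       # country-specific coefficients
      panel = factors @ loadings.T + 0.3 * rng.standard_normal((T, N))

      # Principal-components estimates (large-panel normalisation F'F/T = I).
      u, s, vt = np.linalg.svd(panel, full_matrices=False)
      f_hat = u[:, :r] * np.sqrt(T)                # estimated factors
      lam_hat = panel.T @ f_hat / T                # estimated loadings

      common = f_hat @ lam_hat.T                   # fitted common component
      print("variance share explained:", 1 - np.var(panel - common) / np.var(panel))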
  10. By: Eva Ventura; Albert Satorra
    Abstract: Consider the case in which we have data from repeated surveys covering several geographic areas, and our goal is to characterize these areas on a latent trait that underlies multiple indicators. This situation arises, for example, in surveys of information and communication technologies (ICT) conducted by statistical agencies, the objective of which is to assess the level of ICT in each area and its variation over time. It is often of interest to evaluate the impact of area-specific covariates on the ICT level of the area. This paper develops a methodology based on structural equation models (SEMs) that allows one not only to estimate the level of the latent trait in each of the areas (building an ICT index) but also to assess the variation of this index over time, as well as its association with the area-specific covariates. The methodology is illustrated using the ICT annual survey data collected in the Spanish region of Catalonia for the years 2008 to 2011.
    Keywords: structural equation model; confirmatory factor analysis; longitudinal analysis; index; digital divide; Information and Communication Technologies (ICT)
    Date: 2014–05
    URL: http://d.repec.org/n?u=RePEc:upf:upfgen:1419&r=ecm
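    Sketch: Full SEM estimation would use dedicated software; as a rough stand-in for the core step (one latent trait behind several indicators, with factor scores serving as an area-level index), a one-factor model can be fitted with scikit-learn's FactorAnalysis. The indicators and sample are invented, and the sign and scale of the factor are arbitrary.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(6)

      # Hypothetical area-year observations on five ICT indicators.
      n_obs, n_ind = 200, 5
      latent = rng.standard_normal(n_obs)          # true ICT level
      loadings = np.array([0.9, 0.8, 0.7, 0.6, 0.5])
      X = np.outer(latent, loadings) + 0.4 * rng.standard_normal((n_obs, n_ind))

      # One-factor model: factor scores serve as the ICT index per observation.
      fa = FactorAnalysis(n_components=1, random_state=0)
      ict_index = fa.fit_transform(X).ravel()

      print("loading estimates:", fa.components_.ravel().round(2))
      print("corr(index, latent):", np.corrcoef(ict_index, latent)[0, 1].round(3))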
  11. By: DUPUY Arnaud; GALICHON Alfred; HENRY Marc
    Abstract: This paper contributes to the literature on hedonic models in two ways. First, it uses Queyranne's reformulation of a discrete hedonic model as a network flow problem to prove the existence and integrality of a hedonic equilibrium and to provide efficient computational techniques for hedonic prices. Second, elaborating on the entropic methods developed in Galichon and Salanié (2014), it proposes a new identification strategy for hedonic models in a single market. This methodology allows one to introduce heterogeneities in both consumers' and producers' attributes and to recover producers' profits and consumers' utilities from the observation of production and consumption patterns and the set of hedonic prices.
    Keywords: Hedonic models; Entropic methods; Identification
    JEL: D12 J30 L11
    Date: 2014–06
    URL: http://d.repec.org/n?u=RePEc:irs:cepswp:2014-07&r=ecm
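    Sketch: The paper's identification strategy is its own; the entropic machinery it builds on leads, in the discrete case, to Sinkhorn-type matrix scaling: find the matching distribution that maximises surplus plus an entropy term subject to both marginals. A generic sketch with an invented surplus matrix and type distributions:

      import numpy as np

      rng = np.random.default_rng(7)

      # Discrete matching: n consumer types x m producer types.
      n, m = 5, 4
      surplus = rng.random((n, m))    # invented joint surplus matrix
      p = np.full(n, 1.0 / n)         # consumer type distribution
      q = np.full(m, 1.0 / m)         # producer type distribution
      sigma = 0.1                     # entropic regularisation (temperature)

      # Sinkhorn iterations: scale K = exp(surplus/sigma) to match both marginals.
      K = np.exp(surplus / sigma)
      u, v = np.ones(n), np.ones(m)
      for _ in range(1000):
          u = p / (K @ v)
          v = q / (K.T @ u)
      pi = u[:, None] * K * v[None, :]   # entropic-optimal matching

      print("row sums match p:", np.allclose(pi.sum(axis=1), p))
      print("col sums match q:", np.allclose(pi.sum(axis=0), q))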

This nep-ecm issue is ©2014 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.