nep-ecm New Economics Papers
on Econometrics
Issue of 2006‒09‒23
nineteen papers chosen by
Sune Karlsson
Örebro University

  1. Robust Small Sample Accurate Inference in Moment Condition Models By Serigne N. Lo; Elvezio Ronchetti
  2. ASYMPTOTICS FOR ESTIMATION OF TRUNCATED INFINITE-DIMENSIONAL QUANTILE REGRESSIONS By Serguei Zernov; Victoria Zinde-Walsh; John Galbraith
  3. Forecasting with panel data By Baltagi, Badi H.
  4. Testing for the Cointegrating Rank of a VAR Process with Level Shift and Trend Break By Carsten Trenkler; Pentti Saikkonen; Helmut Lütkepohl
  5. A quasi maximum likelihood approach for large approximate dynamic factor models By Catherine Doz; Domenico Giannone; Lucrezia Reichlin
  6. IMPROVING THE EFFICIENCY AND ROBUSTNESS OF THE SMOOTHED MAXIMUM SCORE ESTIMATOR By Francisco Alvarez-Cuadrado
  7. NON AND SEMI-PARAMETRIC ESTIMATION IN MODELS WITH UNKNOWN SMOOTHNESS By Yulia Kotlyarova; Victoria Zinde-Walsh
  8. Bootstrap and Fast Double Bootstrap Tests of Cointegration Rank with Financial Time Series By Ahlgren, Niklas; Antell, Jan
  9. Panels with Nonstationary Multifactor Error Structures By George Kapetanios; M. Hashem Pesaran; Takashi Yamagata
  10. HOW FAR CAN WE FORECAST? FORECAST CONTENT HORIZONS FOR SOME IMPORTANT MACROECONOMIC TIME SERIES By John Galbraith; Greg Tkacz
  11. ROBUST KERNEL ESTIMATOR FOR DENSITIES OF UNKNOWN SMOOTHNESS By Yulia Kotlyarova; Victoria Zinde-Walsh
  12. Forecasting Using Predictive Likelihood Model Averaging By George Kapetanios; Vincent Labhard; Simon Price
  13. Nonlinear Models with Strongly Dependent Processes and Applications to Forward Premia and Real Exchange Rates By Richard T. Baillie; George Kapetanios
  14. Empirical Bayesian density forecasting in Iowa and shrinkage for the Monte Carlo era By Lewis, Kurt F.; Whiteman, Charles H.
  15. Forecasting using Bayesian and Information Theoretic Model Averaging: An Application to UK Inflation By George Kapetanios; Vincent Labhard; Simon Price
  16. REDUCED-DIMENSION CONTROL REGRESSION By John Galbraith; Victoria Zinde-Walsh
  17. Stochastic Volatility Driven by Large Shocks By George Kapetanios; Elias Tzavalis
  18. How valid can data fusion be? By Kiesl, Hans; Rässler, Susanne
  19. Measuring Regional Market Integration by Dynamic Factor Error Correction Model (DF-ECM) Approach – The Case of Developing Asia By Duo Qin; Marie Anne Cagas; Geoffrey Ducanes; Nedelyn Magtibay-Ramos; Pilipinas F. Quising

  1. By: Serigne N. Lo; Elvezio Ronchetti
    Abstract: Procedures based on the Generalized Method of Moments (GMM) (Hansen, 1982) are basic tools in modern econometrics. In most cases, the theory available for making inference with these procedures is based on first order asymptotic theory. It is well-known that the (first order) asymptotic distribution does not provide accurate p-values and confidence intervals in moderate to small samples. Moreover, in the presence of small deviations from the assumed model, p-values and confidence intervals based on classical GMM procedures can be drastically affected (nonrobustness). Several alternative techniques have been proposed in the literature to improve the accuracy of GMM procedures. These alternatives address either the first order accuracy of the approximations (information and entropy econometrics (IEE)) or the nonrobustness (robust GMM estimators and tests). In this paper, we propose a new alternative procedure which combines robustness properties and accuracy in small samples. Specifically, we combine IEE techniques as developed in Imbens, Spady and Johnson (1998) to obtain finite sample accuracy with robust methods obtained by bounding the original orthogonality function as proposed in Ronchetti and Trojani (2001). This leads to new robust estimators and tests in moment condition models with excellent finite sample accuracy. Finally, we illustrate the accuracy of the new statistic by means of some simulations for three models with overidentifying moment conditions.
    Keywords: Exponential tilting, Generalized method of moments, Information and entropy econometrics, Monte Carlo, Robust tests, Saddlepoint techniques
    Date: 2006–06
    URL: http://d.repec.org/n?u=RePEc:gen:geneem:2006.04&r=ecm
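
As background to the information-theoretic side of this approach, here is a minimal Python sketch of a plain exponential-tilting estimator for a simple overidentified linear moment-condition model. The data-generating process, the use of scipy optimizers, and the one-parameter setup are illustrative assumptions; the paper's actual contribution, robustifying such estimators by bounding the orthogonality function, is not implemented here.

import numpy as np
from scipy.optimize import minimize, minimize_scalar
from scipy.special import logsumexp

# Illustrative data: one parameter, two instruments (overidentified).
rng = np.random.default_rng(0)
n, theta0 = 500, 1.5
z = rng.normal(size=(n, 2))
x = z @ np.array([1.0, 0.5]) + rng.normal(size=n)
y = theta0 * x + rng.normal(size=n)

def moments(theta):
    # g_i(theta) = z_i * (y_i - theta * x_i), an (n x 2) array
    return z * (y - theta * x)[:, None]

def inner_value(theta):
    # min over gamma of log((1/n) * sum_i exp(gamma' g_i(theta))), convex in gamma
    g = moments(theta)
    obj = lambda gam: logsumexp(g @ gam) - np.log(n)
    return minimize(obj, np.zeros(2), method="BFGS").fun

# The exponential-tilting estimate maximizes the inner minimum over theta, i.e.
# it picks the theta at which the moment conditions are hardest to tilt away from.
res = minimize_scalar(lambda th: -inner_value(th), bounds=(0.0, 3.0), method="bounded")
print("exponential-tilting estimate of theta:", round(res.x, 3))
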
  2. By: Serguei Zernov; Victoria Zinde-Walsh; John Galbraith
    Abstract: Many processes can be represented in a simple form as infinite-order linear series. In such cases, an approximate model is often derived as a truncation of the infinite-order process, for estimation on the finite sample. The literature contains a number of asymptotic distributional results for least squares estimation of such finite truncations, but for quantile estimation, only results for finite-order processes are available at a level of generality that accommodates time series processes. Here we establish consistency and asymptotic normality for conditional quantile estimation of truncations of such infinite-order linear models, with the truncation order increasing in sample size. The proofs use the generalized functions approach and allow for a wide range of time series models as well as other forms of regression model. As an example, many time series processes may be represented as an AR(∞) or an MA(∞); here we use a simulation to illustrate the degree of conformity of finite sample results with the asymptotics, in the case of a truncated AR representation of a moving average.
    JEL: C13 C22
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:mcl:mclwop:2006-16&r=ecm
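
For intuition, the following Python sketch mimics the kind of simulation described in the abstract: an MA(1) series is approximated by truncated AR(p) models estimated by median (0.5-quantile) regression, with the truncation order p growing. The data-generating process, the error distribution, and the use of statsmodels' QuantReg are illustrative assumptions, not the paper's exact design.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
T, theta = 2000, 0.6
eps = rng.standard_t(df=5, size=T + 1)            # heavy-tailed errors
y = eps[1:] + theta * eps[:-1]                    # MA(1): y_t = e_t + theta * e_{t-1}

def lagged_design(y, p):
    # Build (y_t, [y_{t-1}, ..., y_{t-p}]) pairs for t = p, ..., T-1
    Y = y[p:]
    X = np.column_stack([y[p - j:-j] for j in range(1, p + 1)])
    return Y, sm.add_constant(X)

for p in (2, 5, 10):                              # truncation order increasing
    Y, X = lagged_design(y, p)
    fit = sm.QuantReg(Y, X).fit(q=0.5)
    # The AR(inf) representation of this MA(1) has coefficient -(-theta)^j at lag j.
    print(f"p={p:2d}  first AR coefficients:", np.round(fit.params[1:4], 3))
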
  3. By: Baltagi, Badi H.
    Abstract: This paper gives a brief survey of forecasting with panel data. It starts with a simple error component regression and surveys best linear unbiased prediction (BLUP) under various assumptions on the disturbance term, including various ARMA models as well as spatial autoregressive models. The paper also surveys how these forecasts have been used in panel data applications, running horse races between heterogeneous and homogeneous panel data models using out-of-sample forecasts.
    Keywords: Forecasting, BLUP, Panel Data, Spatial Dependence, Serial Correlation
    JEL: C33
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:4754&r=ecm
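
A compact Python sketch of the BLUP idea for the one-way error component model, the starting point of the survey. The simulated data and the use of pooled OLS in place of feasible GLS for the slope are simplifying assumptions; the essential point is the shrinkage correction added to the x'beta forecast.

import numpy as np

rng = np.random.default_rng(2)
N, T, beta = 100, 10, 2.0
mu = rng.normal(scale=1.0, size=N)                       # individual effects
x = rng.normal(size=(N, T + 1))                          # last column: forecast period
y = beta * x[:, :T] + mu[:, None] + rng.normal(scale=0.5, size=(N, T))

# Pooled OLS for beta (simplification; the survey works with GLS/BLUP machinery)
X, Y = x[:, :T].ravel(), y.ravel()
b_hat = (X @ Y) / (X @ X)

u = y - b_hat * x[:, :T]                                 # residuals, N x T
u_bar = u.mean(axis=1)                                   # individual average residuals
s2_nu = ((u - u_bar[:, None]) ** 2).sum() / (N * (T - 1))
s2_mu = max(u_bar.var(ddof=1) - s2_nu / T, 0.0)

shrink = T * s2_mu / (T * s2_mu + s2_nu)                 # BLUP shrinkage factor
y_forecast = b_hat * x[:, T] + shrink * u_bar            # forecasts for period T+1
print("estimated beta:", round(b_hat, 3), " shrinkage factor:", round(shrink, 3))
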
  4. By: Carsten Trenkler; Pentti Saikkonen; Helmut Lütkepohl
    Abstract: A test for the cointegrating rank of a vector autoregressive (VAR) process with a possible level shift and broken linear trend is proposed. The break point is assumed to be known. The setup is a VAR process for cointegrated variables. The tests are not likelihood ratio tests; instead, the deterministic terms, including the broken trends, are first removed by a GLS procedure, and a likelihood ratio type test is applied to the adjusted series. The asymptotic null distribution of the test is derived, and it is shown by a Monte Carlo experiment that the test has better small sample properties in many cases than a corresponding Gaussian likelihood ratio test for the cointegrating rank.
    Keywords: Cointegration, structural break, vector autoregressive process, error correction model
    JEL: C32
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2006-067&r=ecm
  5. By: Catherine Doz (Directrice de l'UFR Economie Gestion, University of Cergy-Pontoise - Department of Economics, 33 Boulevard du port, F-95011 Cergy-Pontoise Cedex, France.); Domenico Giannone (Free University of Brussels (VUB/ULB) - European Center for Advanced Research in Economics and Statistics (ECARES), Ave. Franklin D Roosevelt, 50 - C.P. 114, B-1050 Brussels, Belgium.); Lucrezia Reichlin (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.)
    Abstract: This paper considers quasi-maximum likelihood estimation of a dynamic approximate factor model when the panel of time series is large. Maximum likelihood is analyzed under different sources of misspecification: omitted serial correlation of the observations and cross-sectional correlation of the idiosyncratic components. It is shown that the effects of misspecification on the estimation of the common factors are negligible for large sample size (T) and cross-sectional dimension (n). The estimator is feasible when n is large and is easily implemented using the Kalman smoother and the EM algorithm, as in traditional factor analysis. Simulation results illustrate the empirical conditions under which we can expect an improvement with respect to the simple principal components estimators considered by Bai (2003), Bai and Ng (2002), Forni, Hallin, Lippi, and Reichlin (2000, 2005b), and Stock and Watson (2002a,b).
    Keywords: Factor model, large cross-sections, Quasi maximum likelihood
    JEL: C51 C32 C33
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20060674&r=ecm
  6. By: Francisco Alvarez-Cuadrado
    Abstract: The binary-response maximum score (MS) estimator is a robust estimator which can accommodate heteroskedasticity of an unknown form; J. Horowitz (1992) defined a smoothed maximum score estimator (SMS) and demonstrated that it improves the convergence rate for sufficiently smooth conditional error densities. In this paper we relax Horowitz's smoothness assumptions on the model and extend his asymptotic results. We also derive a joint limiting distribution of estimators with different bandwidths and smoothing kernels. We construct an estimator that combines SMS estimators for different bandwidths and kernels to overcome the uncertainty over the choice of bandwidth when the degree of smoothness of the error distribution is unknown. A Monte Carlo study demonstrates the gains in efficiency and robustness.
    JEL: C14 C25
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:mcl:mclwop:2004-01&r=ecm
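
A short Python sketch of the smoothed maximum score idea for a single bandwidth and kernel (the paper's contribution is precisely to combine several such estimators). The design, the heteroskedastic logistic errors, the normal-CDF smoothing function and the bandwidth value are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(3)
n, b2_true = 1000, -0.75
x1, x2 = rng.normal(size=n), rng.normal(size=n)
u = rng.logistic(scale=0.5, size=n) * (1 + 0.5 * np.abs(x1))   # heteroskedastic, median zero
y = (x1 + b2_true * x2 + u >= 0).astype(float)                 # binary response, coef on x1 = 1

def sms_objective(b2, h):
    index = x1 + b2 * x2
    return np.mean((2 * y - 1) * norm.cdf(index / h))          # smoothed score to maximise

h = 0.2                                                        # one bandwidth; the paper
res = minimize_scalar(lambda b: -sms_objective(b, h),          # combines several h/kernel pairs
                      bounds=(-3, 3), method="bounded")
print("SMS estimate of b2:", round(res.x, 3))
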
  7. By: Yulia Kotlyarova; Victoria Zinde-Walsh
    Abstract: Many asymptotic results for kernel-based estimators were established under some smoothness assumption on the density. For cases where the smoothness assumptions used to derive unbiasedness or the asymptotic rate may not hold, we propose a combined estimator that can attain the best available rate without knowledge of the density's smoothness. A Monte Carlo example confirms the good performance of the combined estimator.
    JEL: C14
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:mcl:mclwop:2006-15&r=ecm
  8. By: Ahlgren, Niklas (Swedish School of Economics and Business Administration); Antell, Jan (Swedish School of Economics and Business Administration)
    Abstract: The likelihood ratio test of cointegration rank is the most widely used test for cointegration. Many studies have shown that its finite sample distribution is not well approximated by the limiting distribution. The article introduces bootstrap and fast double bootstrap (FDB) algorithms for the likelihood ratio test and evaluates them by Monte Carlo simulation experiments. It finds that the performance of the bootstrap test is very good. The more sophisticated FDB produces a further improvement in cases where the performance of the asymptotic test is very unsatisfactory and the ordinary bootstrap does not work as well as it might. Furthermore, the Monte Carlo simulations provide a number of guidelines on when the bootstrap and FDB tests can be expected to work well. Finally, the tests are applied to US interest rates and international stock price series. It is found that the asymptotic test tends to overestimate the cointegration rank, while the bootstrap and FDB tests choose the correct cointegration rank.
    Keywords: Bootstrap; Cointegration; Financial time series; Likelihood ratio test
    Date: 2006–09–14
    URL: http://d.repec.org/n?u=RePEc:hhb:hanken:0519&r=ecm
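
The bootstrap and fast double bootstrap p-value calculations have a simple generic structure, sketched below in Python for a toy one-sided mean test with the null imposed in the resampling. The statistic and resampling scheme are illustrative stand-ins; the paper applies the same logic to the likelihood ratio rank test with a VAR-based bootstrap.

import numpy as np

rng = np.random.default_rng(4)
x = rng.exponential(size=40) - 1.0                     # skewed data, true mean 0

def tstat(sample):
    return np.sqrt(len(sample)) * sample.mean() / sample.std(ddof=1)

tau_hat = tstat(x)
x0 = x - x.mean()                                      # impose the null for resampling
B = 999
tau1 = np.empty(B)                                     # first-level bootstrap statistics
tau2 = np.empty(B)                                     # one second-level statistic per draw
for b in range(B):
    xb = rng.choice(x0, size=len(x0), replace=True)
    tau1[b] = tstat(xb)
    xbb = rng.choice(xb - xb.mean(), size=len(x0), replace=True)
    tau2[b] = tstat(xbb)

p_boot = np.mean(tau1 > tau_hat)                       # ordinary bootstrap p-value
q = np.quantile(tau2, 1.0 - p_boot)                    # (1 - p_boot) quantile of 2nd level
p_fdb = np.mean(tau1 > q)                              # fast double bootstrap p-value
print(f"tau = {tau_hat:.2f}, bootstrap p = {p_boot:.3f}, FDB p = {p_fdb:.3f}")
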
  9. By: George Kapetanios (Queen Mary, University of London); M. Hashem Pesaran (Cambridge University and Trinity College, Cambridge); Takashi Yamagata (Cambridge University)
    Abstract: The presence of cross-sectionally correlated error terms invalidates much inferential theory of panel data models. Recent work by Pesaran (2006) has suggested a method which makes use of cross-sectional averages to provide valid inference for stationary panel regressions with a multifactor error structure. This paper extends that work and examines the important case where the unobserved common factors follow unit root processes and could be cointegrated. It is found that the presence of unit roots does not affect most theoretical results, which continue to hold irrespective of the integration and cointegration properties of the unobserved factors. This finding is further supported for small samples via an extensive Monte Carlo study. In particular, the results of the Monte Carlo study suggest that the cross-sectional average based method is robust to a wide variety of data generation processes and has lower biases than all of the alternative estimation methods considered in the paper.
    Keywords: Cross section dependence, Large panels, Unit roots, Principal components, Common correlated effects
    JEL: C12 C13 C33
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp569&r=ecm
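
The cross-sectional average idea is easy to illustrate. The Python sketch below simulates a panel whose errors and regressor load on a nonstationary common factor, and compares naive pooled OLS with a common correlated effects type estimator that partials out the cross-section averages of y and x. The data-generating process and the single-regressor setup are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(5)
N, T, beta = 50, 100, 1.0
f = np.cumsum(rng.normal(size=T))                       # nonstationary common factor
gamma = 1.0 + rng.normal(size=N)                        # heterogeneous error loadings
lam = 0.5 + rng.normal(size=N)                          # loadings in the regressor
x = lam[:, None] * f + rng.normal(size=(N, T))
y = beta * x + gamma[:, None] * f + rng.normal(size=(N, T))

# Projection that removes [constant, cross-section averages of x and y]
H = np.column_stack([np.ones(T), x.mean(axis=0), y.mean(axis=0)])
M = np.eye(T) - H @ np.linalg.solve(H.T @ H, H.T)

num = sum(x[i] @ M @ y[i] for i in range(N))
den = sum(x[i] @ M @ x[i] for i in range(N))
b_cce = num / den

b_pooled = (x.ravel() @ y.ravel()) / (x.ravel() @ x.ravel())   # naive pooled OLS
print(f"pooled OLS: {b_pooled:.3f}   CCE-type: {b_cce:.3f}   true beta = {beta}")
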
  10. By: John Galbraith; Greg Tkacz
    Abstract: For stationary transformations of variables, there exists a maximum horizon beyond which forecasts can provide no more information about the variable than is present in the unconditional mean. Meteorological forecasts, typically excepting only experimental or exploratory situations, are not reported beyond this horizon; by contrast, little generally accepted information about such maximum horizons is available for economic variables. In this paper we estimate such content horizons for a variety of economic variables, and compare these with the maximum horizons which we observe reported in a large sample of empirical economic forecasting studies. We find many instances of published studies which provide forecasts exceeding, often by substantial margins, our estimates of the content horizon for the particular variable and frequency. We suggest some simple reporting practices for forecasts that could bring greater transparency to the process of making and interpreting economic forecasts.
    JEL: C53
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:mcl:mclwop:2006-13&r=ecm
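
A minimal Python sketch of a forecast content horizon calculation: the h-step forecast MSE of a fitted AR(1) is compared with the MSE of the unconditional-mean forecast, and the horizon is taken to be the point where the gain becomes negligible. The AR(1) model, the 1% cut-off, and the sample split are illustrative assumptions, not the paper's procedure.

import numpy as np

rng = np.random.default_rng(6)
T, phi = 600, 0.8
e = rng.normal(size=T)
y = np.empty(T)
y[0] = e[0]
for t in range(1, T):                                   # AR(1) data
    y[t] = phi * y[t - 1] + e[t]

train, test = y[:400], y[400:]
phi_hat = np.polyfit(train[:-1], train[1:], 1)[0]       # slope of y_t on y_{t-1}
mean_hat = train.mean()

max_h = 15
for h in range(1, max_h + 1):
    origins, targets = test[:-h], test[h:]              # h-step forecasts from each origin
    fcst = mean_hat + (phi_hat ** h) * (origins - mean_hat)
    mse_model = np.mean((targets - fcst) ** 2)
    mse_mean = np.mean((targets - mean_hat) ** 2)
    content = 1.0 - mse_model / mse_mean                # forecast content at horizon h
    if content < 0.01:                                  # illustrative cut-off
        print(f"estimated content horizon: about {h} steps ahead")
        break
else:
    print(f"content still above the cut-off at h = {max_h}")
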
  11. By: Yulia Kotlyarova; Victoria Zinde-Walsh
    Abstract: Results on nonparametric kernel estimators of density differ according to the assumed degree of density smoothness; it is often assumed that the density function is at least twice differentiable. However, there are cases where non-smooth density functions may be of interest. We provide asymptotic results for kernel estimation of a continuous density for an arbitrary bandwidth/kernel pair. We also derive the limit joint distribution of kernel density estimators corresponding to different bandwidths and kernel functions. Using these results, we construct an estimator that combines several estimators for different bandwidth/kernel pairs to protect against the negative consequences of errors in assumptions about the order of smoothness. The results of a Monte Carlo experiment confirm the usefulness of the combined estimator. We demonstrate that while in the standard normal case the combined estimator has a somewhat higher mean squared error than the standard kernel estimator, both estimators are highly accurate. On the other hand, for a non-smooth density, where the MSE gets very large, the combined estimator provides uniformly better results than the standard estimator.
    JEL: C14
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:mcl:mclwop:2005-05&r=ecm
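
A small Python sketch of the combination idea: several kernel density estimates with different bandwidth/kernel pairs are computed and combined. Equal weights are used purely for illustration; the paper derives weights from the joint limiting distribution of the individual estimators. The mixture target density, which is discontinuous at zero, is an illustrative choice of a non-smooth density.

import numpy as np

rng = np.random.default_rng(7)
x = np.concatenate([rng.normal(size=400),
                    rng.exponential(size=400)])         # density has a jump at 0

def gaussian(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

def epanechnikov(u):
    return 0.75 * np.maximum(1 - u ** 2, 0)

def kde(data, grid, kernel, h):
    u = (grid[:, None] - data[None, :]) / h
    return kernel(u).mean(axis=1) / h

grid = np.linspace(-3, 5, 200)
pairs = [(gaussian, 0.15), (gaussian, 0.40), (epanechnikov, 0.25)]
estimates = np.array([kde(x, grid, k, h) for k, h in pairs])
combined = estimates.mean(axis=0)                       # equal weights (illustrative only)
print("combined density estimate near the discontinuity at 0:",
      round(combined[np.argmin(np.abs(grid))], 3))
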
  12. By: George Kapetanios (Queen Mary, University of London); Vincent Labhard (Bank of England); Simon Price (Bank of England and City University)
    Abstract: Recently, there has been increasing interest in forecasting methods that utilise large datasets. We explore forecasting with model averaging in a frequentist setting, where the models are weighted by their out-of-sample forecasting performance as measured by the predictive likelihood. We apply our method to forecasting UK inflation and find that the new method performs well; in some respects it outperforms other averaging methods.
    Keywords: Forecasting, Inflation, Bayesian model averaging, Akaike criterion, Forecast combining
    JEL: C11 C15 C53
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp567&r=ecm
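
A Python sketch of the weighting scheme: each candidate model is scored by its out-of-sample predictive log-likelihood over a hold-out window, weights are proportional to exp(score), and the forecasts are combined. Gaussian one-step densities and a handful of AR candidates are illustrative assumptions, not the paper's large-dataset setup.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
T = 300
y = np.zeros(T)
for t in range(2, T):                                    # AR(2) data-generating process
    y[t] = 0.5 * y[t - 1] + 0.2 * y[t - 2] + rng.normal()

def ar_fit_forecast(series, p):
    # Fit AR(p) with intercept by OLS; return (one-step forecast, residual std)
    X = np.column_stack([series[p - j - 1:-j - 1] for j in range(p)] + [np.ones(len(series) - p)])
    Y = series[p:]
    coefs, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ coefs
    fcst = np.r_[series[-1:-p - 1:-1], 1.0] @ coefs
    return fcst, resid.std(ddof=p + 1)

models = [1, 2, 4]                                       # candidate AR orders
scores = np.zeros(len(models))
for t in range(250, T):                                  # hold-out scoring window
    for m, p in enumerate(models):
        fcst, sd = ar_fit_forecast(y[:t], p)
        scores[m] += norm.logpdf(y[t], loc=fcst, scale=sd)   # predictive log-likelihood

w = np.exp(scores - scores.max())
w /= w.sum()                                             # predictive-likelihood weights
final = sum(wi * ar_fit_forecast(y, p)[0] for wi, p in zip(w, models))
print("weights:", np.round(w, 3), " combined one-step forecast:", round(final, 3))
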
  13. By: Richard T. Baillie (Michigan State University and Queen Mary, University of London); George Kapetanios (Queen Mary, University of London)
    Abstract: This paper considers estimation and inference in some general nonlinear time series models which are embedded in a strongly dependent, long memory process. Some new results are provided on the properties of a time domain MLE for these models. The paper also includes a detailed simulation study which compares the time domain MLE with a two-step estimator, where the Local Whittle estimator is first employed to filter out the long memory component. The time domain MLE is found to be generally superior to two-step estimation. Further, the simulation study documents the difficulty of precisely estimating the parameter associated with the speed of transition. Finally, the fractionally integrated, nonlinear autoregressive-ESTAR model is found to be extremely useful in representing some financial time series such as the forward premium and real exchange rates.
    Keywords: Non-linearity, ESTAR models, Strong dependence, Forward premium, Real exchange rates
    JEL: C22 C12 F31
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp570&r=ecm
  14. By: Lewis, Kurt F.; Whiteman, Charles H.
    Abstract: The track record of a sixteen-year history of density forecasts of state tax revenue in Iowa is studied, and potential improvements sought through a search for better performing “priors” similar to that conducted two decades ago for point forecasts by Doan, Litterman, and Sims (Econometric Reviews, 1984). Comparisons of the point and density forecasts produced under the flat prior are made to those produced by the traditional (mixed estimation) “Bayesian VAR” methods of Doan, Litterman, and Sims, as well as to fully Bayesian, “Minnesota Prior” forecasts. The actual record, and to a somewhat lesser extent, the record of the alternative procedures studied in pseudo-real-time forecasting experiments, share a characteristic: subsequently realized revenues are in the lower tails of the predicted distributions “too often”. An alternative empirically-based prior is found by working directly on the probability distribution for the VAR parameters, seeking a better-performing entropically tilted prior that minimizes in-sample mean-squared-error subject to a Kullback-Leibler divergence constraint that the new prior not differ “too much” from the original. We also study the closely related topic of robust prediction appropriate for situations of ambiguity. Robust “priors” are competitive in out-of-sample forecasting; despite the freedom afforded the entropically tilted prior, it does not perform better than the simple alternatives.
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:4757&r=ecm
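
The entropic tilting step can be sketched generically in Python: Monte Carlo draws from an initial prior are re-weighted to reduce an in-sample loss subject to a Kullback-Leibler budget relative to the original equal weights, which yields weights of exponential-tilting form. The scalar parameter, the quadratic stand-in loss and the KL budget below are illustrative assumptions, not the paper's revenue-VAR application.

import numpy as np

rng = np.random.default_rng(9)
draws = rng.normal(loc=0.5, scale=1.0, size=5000)        # prior draws of a parameter
loss = (draws - 0.9) ** 2                                # in-sample loss per draw (stand-in)

def tilted_weights(tau):
    # Exponentially tilted weights w_i proportional to exp(-loss_i / tau)
    w = np.exp(-(loss - loss.min()) / tau)
    return w / w.sum()

def kl_to_uniform(w):
    # KL divergence of the weights from the original equal weights 1/n
    return np.sum(w * np.log(np.maximum(w * len(w), 1e-300)))

kappa = 0.1                                              # KL budget
lo, hi = 1e-3, 1e3
for _ in range(60):                                      # bisection: KL is decreasing in tau
    mid = np.sqrt(lo * hi)
    if kl_to_uniform(tilted_weights(mid)) > kappa:
        lo = mid
    else:
        hi = mid
w = tilted_weights(hi)
print("KL:", round(kl_to_uniform(w), 3),
      " prior mean:", round(draws.mean(), 3),
      " tilted mean:", round(np.average(draws, weights=w), 3))
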
  15. By: George Kapetanios (Queen Mary, University of London); Vincent Labhard (Bank of England); Simon Price (Bank of England)
    Abstract: In recent years there has been increasing interest in forecasting methods that utilise large datasets, driven partly by the recognition that policymaking institutions need to process large quantities of information. Factor analysis is one popular way of doing this. Forecast combination is another, and it is on this that we concentrate. Bayesian model averaging methods have been widely advocated in this area, but a neglected frequentist approach is to use information theoretic based weights. We consider the use of model averaging in forecasting UK inflation with a large dataset from this perspective. We find that an information theoretic model averaging scheme can be a powerful alternative both to the more widely used Bayesian model averaging scheme and to factor models.
    Keywords: Forecasting, Inflation, Bayesian model averaging, Akaike criteria, Forecast combining
    JEL: C11 C15 C53
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp566&r=ecm
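
The information-theoretic weighting scheme is easy to state: each model gets weight proportional to exp(-0.5 * delta_AIC). The Python sketch below applies it to a few illustrative AR models; the candidate set, the AR(1) data and the Gaussian AIC formula are assumptions made for the example, not the paper's large-dataset application to UK inflation.

import numpy as np

rng = np.random.default_rng(10)
T = 300
y = np.zeros(T)
for t in range(1, T):                                    # AR(1) data-generating process
    y[t] = 0.6 * y[t - 1] + rng.normal()

def fit_ar(series, p):
    # OLS AR(p) with intercept: return (AIC, one-step forecast)
    X = np.column_stack([series[p - j - 1:-j - 1] for j in range(p)] + [np.ones(len(series) - p)])
    Y = series[p:]
    coefs, *_ = np.linalg.lstsq(X, Y, rcond=None)
    rss = np.sum((Y - X @ coefs) ** 2)
    aic = len(Y) * np.log(rss / len(Y)) + 2 * (p + 1)
    fcst = np.r_[series[-1:-p - 1:-1], 1.0] @ coefs
    return aic, fcst

orders = [1, 2, 3, 6]
aics, fcsts = zip(*(fit_ar(y, p) for p in orders))
delta = np.array(aics) - min(aics)
w = np.exp(-0.5 * delta)
w /= w.sum()                                             # Akaike weights
print("weights:", np.round(w, 3), " combined forecast:", round(float(np.dot(w, fcsts)), 3))
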
  16. By: John Galbraith; Victoria Zinde-Walsh
    Abstract: A model to investigate the relationship between one variable and another usually requires controls for numerous other effects which are not constant across the sample; where the model omits some elements of the true process, estimates of parameters of interest will typically be inconsistent. Here we investigate conditions under which, with a set of potential controls which is large (possibly infinite), orthogonal transformations of a subset of potential controls can nonetheless be used in a parsimonious regression involving a reduced number of orthogonal components (the ‘reduced-dimension control regression’), to produce consistent (and asymptotically normal, given further restrictions) estimates of a parameter of interest, in a general setting. We examine selection of the particular orthogonal directions, using a new criterion which takes into account both the magnitude of the eigenvalue and the correlation of the eigenvector with the variable of interest. Simulation experiments show good finite-sample performance of the method.
    JEL: C13 C22
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:mcl:mclwop:2006-17&r=ecm
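
A rough Python sketch of the reduced-dimension control idea: principal components of a large set of controls are extracted, a few directions are selected, and they enter the regression alongside the variable of interest. The selection rule below (eigenvalue share times absolute correlation with the regressor of interest) is only a simplified stand-in for the criterion proposed in the paper, and the factor-structure simulation is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(11)
n, m, k, beta = 400, 60, 3, 1.0
F = rng.normal(size=(n, k))                                   # low-dimensional drivers
W = F @ rng.normal(size=(k, m)) + 0.5 * rng.normal(size=(n, m))   # many potential controls
x = F @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=n)       # regressor of interest
y = beta * x + F @ np.array([2.0, -1.0, 1.5]) + rng.normal(size=n)

# Principal components of the (centred) controls
Wc = W - W.mean(axis=0)
U, s, _ = np.linalg.svd(Wc, full_matrices=False)
pcs = U * s
share = (s ** 2) / np.sum(s ** 2)

# Select a few directions by eigenvalue share times |corr(component, x)|
score = share * np.abs([np.corrcoef(pcs[:, j], x)[0, 1] for j in range(pcs.shape[1])])
keep = np.argsort(score)[::-1][:5]

def coef_on_x(y, x, controls):
    Z = np.column_stack([x, controls, np.ones(len(y))])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return b[0]

print("no controls:    ", round(coef_on_x(y, x, np.empty((n, 0))), 3))
print("5 selected PCs: ", round(coef_on_x(y, x, pcs[:, keep]), 3), "  (true beta = 1.0)")
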
  17. By: George Kapetanios (Queen Mary, University of London); Elias Tzavalis (Queen Mary, University of London)
    Abstract: This paper presents a new model of stochastic volatility which allows for infrequent shifts in the mean of volatility, known as structural breaks. These are endogenously driven by large innovations in stock returns arriving in the market. The model has a number of interesting properties. Among them, it can allow for shifts in volatility which are of stochastic timing and magnitude. This model can be used to distinguish permanent shifts in volatility, coming from large pieces of news arriving in the market, from ordinary volatility shocks.
    Keywords: Stochastic volatility, Structural breaks
    JEL: C22 C15
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp568&r=ecm
  18. By: Kiesl, Hans (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany]); Rässler, Susanne (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany])
    Abstract: "Data fusion techniques typically aim to achieve a complete data file from different sources which do not contain the same units. Traditionally, this is done on the basis of variables common to all files. It is well known that those approaches establish conditional independence of the specific variables given the common variables, although they may be conditionally dependent in reality. We discuss the objectives of data fusion in the light of their feasibility and distinguish four levels of validity that a fusion technique may achieve. For a rather general situation, we derive the feasible set of correlation matrices for the variables not jointly observed and suggest a new quality index for data fusion. Finally, we present a suitable and effcient multiple imputation procedure to make use of auxiliary information and to overcome the conditional independence assumption." (author's abstract, IAB-Doku) ((en))
    Keywords: Data, data preparation, data quality, correlation, validity, applied statistics, mathematical statistics
    JEL: C11 C15 C81
    Date: 2006–08–31
    URL: http://d.repec.org/n?u=RePEc:iab:iabdpa:200615&r=ecm
  19. By: Duo Qin (Queen Mary, University of London); Marie Anne Cagas (University of the Philippines); Geoffrey Ducanes (University of the Philippines); Nedelyn Magtibay-Ramos (Asian Development Bank); Pilipinas F. Quising (Asian Development Bank)
    Abstract: This paper examines empirically the dynamic process of regional market integration for twelve individual Asian economies using a new modelling approach, which combines a dynamic factor (DF) model with an error-correction model (ECM). This new approach enables us to obtain latent regional dynamic factors, which correspond well with the ‘foreign’ parity variables in theory when the market is imperfectly integrated and which act, in explaining domestic short-run price adjustments, as leading indicators in an error-correction form. The power of the DF-ECM approach is illustrated in its application to measuring market integration in the developing Asian region using monthly data for the past decade.
    Keywords: Law of one price, Market integration, Dynamic factor, Error-correction model
    JEL: F31 F40 F15 C22 C33
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp565&r=ecm

This nep-ecm issue is ©2006 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.