nep-ecm New Economics Papers
on Econometrics
Issue of 2017‒01‒15
twenty papers chosen by
Sune Karlsson
Örebro universitet

  1. Internally Consistent Estimation of Nonlinear Panel Data Models with Correlated Random Effects By Yu-Chin Hsu; Ji-Liang Shiu
  2. Identification-robust moment-based tests for Markov-switching in autoregressive models By Jean-Marie Dufour; Richard Luger
  3. Estimation of Possibly Non-Stationary First-Order Auto-Regressive Processes By Ana Paula Martins
  4. Exogeneity tests, incomplete models, weak identification and non-Gaussian distributions: invariance and finite-sample distributional theory By Firmin Doko Tchatoka; Jean-Marie Dufour
  5. Applying the Fractional Response Model to Survey Research in Accounting By Susanna Gallani; Ranjani Krishnan
  6. On Standard-Error-Decreasing Complementarity: Why Collinearity is Not the Whole Story By Bernd Hayo
  7. Truncated sum of squares estimation of fractional time series models with deterministic trends By Javier Hualde; Morten Ørregaard Nielsen
  8. Reducing bias in nonparametric density estimation via bandwidth dependent kernels: L1 view By Mynbaev, Kairat; Martins-Filho, Carlos
  9. Long Memory, Breaks, and Trends: On the Sources of Persistence in Inflation Rates By Rinke, Saskia; Busch, Marie; Leschinski, Christian
  10. Impulse Response Estimation By Smooth Local Projections By Barnichon, Régis; Brownlees, Christian
  11. Changes in Persistence in Outlier Contaminated Time Series By Hirsch, Tristan; Rinke, Saskia
  12. IT outsourcing and firm productivity: Eliminating bias from selective missingness in the dependent variable By Breunig, Christoph; Kummer, Michael; Ohnemus, Jorg; Viete, Steffen
  13. Automatic Signal Extraction for Stationary and Non-Stationary Time Series by Circulant SSA By Bógalo, Juan; Poncela, Pilar; Senra, Eva
  14. The perils of Counterfactual Analysis with Integrated Processes By Carlos Viana de Carvalho; Ricardo Masini; Marcelo Cunha Medeiros
  15. Cellwise robust regularized discriminant analysis By Stéphanie Aerts; Ines Wilms
  16. A Bayesian Reversible Jump Piecewise Hazard approach for modelling rate changes in mass shootings By Andrew G. Chapple
  17. "Multivariate Stochastic Volatility Model with Realized Volatilities and Pairwise Realized Correlations " By Yuta Yamauchi; Yasuhiro Omori
  18. Stochastic processes of limited frequency and the effects of oversampling By D.S.G. Pollock
  19. Asset correlation estimation for inhomogeneous exposure pools By Christoph Wunderer
  20. Nowcasting Finnish Turnover Indexes Using Firm-Level Data By Fornaro, Paolo; Luomaranta, Henri; Saarinen, Lauri

  1. By: Yu-Chin Hsu (Institute of Economics, Academia Sinica, Taipei, Taiwan); Ji-Liang Shiu (Institute for Economic and Social Research, Jinan University)
    Abstract: This paper investigates identification and estimation of semi-parametric nonlinear panel data models with correlated random effects (CRE). It is shown that under the Mundlak-type CRE specification, the average (or integrated) likelihood is the convolution of the proposed model and the conditional distribution of the unobserved heterogeneity. The conditional distribution of the unobserved heterogeneity can then be recovered by means of Fourier transformation without imposing any distributional assumptions on it. Combining the proposed conditional distributions of the outcome variables with the recovered distribution of the unobserved heterogeneity, we can construct a parametric family of average likelihood functions of observables and then show that the parameter vector is identifiable. Based on the identification condition, we propose a semi-parametric two-step maximum likelihood estimator which is root-n consistent and asymptotically normal. Compared with conventional parametric CRE approaches, the advantage of our method is that it is not subject to functional form misspecification. We investigate the finite sample properties of the proposed estimator through a Monte Carlo study and apply our method to determine the persistence effects of union membership.
    Keywords: Nonlinear panel data models, Semi-parametric identification, Correlated random effects, Semi-parametric two-step maximum likelihood estimator
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:sin:wpaper:17-a002&r=ecm
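    The deconvolution idea at the core of the identification argument can be seen in a toy setting: if the observed density is the convolution of a known component and an unknown heterogeneity density, the latter's characteristic function is the ratio of the two. The sketch below uses empirical characteristic functions and a known normal noise component; all specifics are illustrative assumptions, not the paper's panel-likelihood construction.

```python
# Toy deconvolution by characteristic-function division. The paper works
# with the integrated likelihood of a panel model; here the "model" part
# is simply known N(0,1) noise, an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
n = 20000
a = rng.normal(1.0, 0.7, n)                    # unobserved heterogeneity
y = a + rng.standard_normal(n)                 # observed: a + known N(0,1) noise

t = np.linspace(-3, 3, 121)                    # frequency grid, t[60] = 0
ecf_y = np.array([np.exp(1j * ti * y).mean() for ti in t])
cf_noise = np.exp(-0.5 * t ** 2)               # known N(0,1) characteristic fn
cf_a = ecf_y / cf_noise                        # deconvolution by division

# Recover the mean of the heterogeneity from cf'(0) by central difference
dt = t[1] - t[0]
mean_a = np.imag((cf_a[61] - cf_a[59]) / (2 * dt))
print("recovered mean of heterogeneity:", mean_a)   # close to 1.0
```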
  2. By: Jean-Marie Dufour; Richard Luger
    Abstract: This paper develops tests of the null hypothesis of linearity in the context of autoregressive models with Markov-switching means and variances. These tests are robust to the identification failures that plague conventional likelihood-based inference methods. The approach exploits the moments of normal mixtures implied by the regime-switching process and uses Monte Carlo test techniques to deal with the presence of an autoregressive component in the model specification. The proposed tests have very respectable power in comparison to the optimal tests for Markov-switching parameters of Carrasco et al. (2014) and they are also quite attractive owing to their computational simplicity. The new tests are illustrated with an empirical application to an autoregressive model of U.S. output growth.
    Keywords: Mixture distributions; Markov chains; Regime switching; Parametric bootstrap; Monte Carlo tests; Exact inference
    JEL: C12 C15 C22 C52
    Date: 2016–12–31
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2016s-63&r=ecm
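    The Monte Carlo test technique the paper relies on can be sketched generically in a few lines: simulate the statistic under the null and use the rank of the observed statistic to obtain an exact p-value. The statistic and null below are toy stand-ins, not the paper's moment-based statistics for Markov switching.

```python
# Generic Monte Carlo (parametric bootstrap) test in the spirit of exact
# Monte Carlo tests: compare the observed statistic with statistics from
# artificial samples drawn under the null. Toy statistic and null only.
import numpy as np

def mc_pvalue(stat_fn, simulate_null, y, n_rep=99, seed=0):
    """Exact MC p-value: rank of the observed statistic among simulated ones.
    The test has exact level alpha when (n_rep + 1) * alpha is an integer."""
    rng = np.random.default_rng(seed)
    s0 = stat_fn(y)
    sims = np.array([stat_fn(simulate_null(rng, len(y))) for _ in range(n_rep)])
    return (1 + np.sum(sims >= s0)) / (n_rep + 1)

# Toy example: N(0,1) null against fat tails, via excess kurtosis
stat = lambda x: np.mean(((x - x.mean()) / x.std()) ** 4) - 3.0
null = lambda rng, n: rng.standard_normal(n)

rng = np.random.default_rng(1)
y = rng.standard_t(df=3, size=200)             # data from a fat-tailed DGP
print("MC p-value:", mc_pvalue(stat, null, y))
```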
  3. By: Ana Paula Martins
    Abstract: This paper inspects a grid search algorithm to estimate the AR(1) process, based on joint estimation of the canonical AR(1) equation along with its reverse form. The method relies on the GLS principle, accounting for the covariance error structure of the special estimable system. Nevertheless, relying on across-equation-restricted system estimation with a free covariance structure stands as a potential improvement. The algorithm is implemented computationally and applied to inference on the AR(1) parameter of simulated series, some stationary and others non-stationary. Additionally, it is argued, and illustrated by simulation, that non-stationary AR(1) processes appear to be consistently estimable by OLS. It is also suggested that the parameter of a stationary AR(1) process is estimable by OLS from the AR(2) representation of its non-stationary "first-integrated" series, or from the joint OLS estimate of the canonical and reverse forms of the AR(1) process. The paper concludes that differenced processes, D(p) (stationary after being integrated p times), deserve further study.
    Keywords: Nonlinear Estimation; Grid Search Methods; AR(1) Processes; Integrated Series; Differenced Processes; Factored AR(1) Processes; Unit Roots.
    JEL: C22 C13 C12 C63
    Date: 2016–11–21
    URL: http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2016_21&r=ecm
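    A minimal check of the OLS claim above on simulated data: regress y_t on y_{t-1} for stationary, unit-root, and explosive parameter values. Illustrative only; the paper's grid-search GLS algorithm is not reproduced.

```python
# OLS estimation of an AR(1) coefficient on simulated stationary and
# non-stationary series: a quick look at the abstract's claim that OLS
# remains informative outside |rho| < 1.
import numpy as np

def simulate_ar1(rho, n, rng):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + rng.standard_normal()
    return y

def ols_ar1(y):
    # slope of y_t on y_{t-1}, no intercept
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

rng = np.random.default_rng(42)
for rho in (0.5, 1.0, 1.05):                   # stationary, unit root, explosive
    est = np.mean([ols_ar1(simulate_ar1(rho, 500, rng)) for _ in range(200)])
    print(f"rho = {rho:5.2f}  mean OLS estimate = {est:.3f}")
```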
  4. By: Firmin Doko Tchatoka; Jean-Marie Dufour
    Abstract: We study the distributions of Durbin-Wu-Hausman (DWH) and Revankar-Hartley (RH) tests for exogeneity from a finite-sample viewpoint, under the null and alternative hypotheses. We consider linear structural models with possibly non-Gaussian errors, where structural parameters may not be identified and where reduced forms can be incompletely specified (or nonparametric). Concerning level control, we characterize the null distributions of all the test statistics. Through conditioning and invariance arguments, we show that these distributions do not involve nuisance parameters. In particular, this applies to several test statistics for which no finite-sample distributional theory is yet available, such as the standard statistic proposed by Hausman (1978). The distributions of the test statistics may be non-standard, so corrections to the usual asymptotic critical values are needed, but the characterizations are sufficiently explicit to yield finite-sample (Monte Carlo) tests of the exogeneity hypothesis. The procedures so obtained are robust to weak identification, missing instruments, or misspecified reduced forms, and can easily be adapted to allow for parametric non-Gaussian error distributions. We give a general invariance result (block triangular invariance) for exogeneity test statistics. This property yields a convenient exogeneity canonical form and a parsimonious reduction of the parameters on which power depends. In the extreme case where no structural parameter is identified, the distributions under the alternative hypothesis and the null hypothesis are identical, so the power function is flat, for all the exogeneity statistics. However, as soon as identification does not fail completely, this phenomenon typically disappears. We present simulation evidence which confirms the finite-sample theory. The theoretical results are illustrated with two empirical examples: the relation between trade and economic growth, and the widely studied problem of returns to education.
    Keywords: Exogeneity; Durbin-Wu-Hausman test; weak instrument; incomplete model; non-Gaussian; weak identification; identification robust; finite-sample theory; pivotal; invariance; Monte Carlo test; power
    JEL: C3 C12 C15 C52
    Date: 2016–12–31
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2016s-62&r=ecm
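    For reference, the control-function form of the DWH exogeneity test that the paper studies in finite samples can be computed as follows on simulated data; the asymptotic t-test below is what the paper's Monte Carlo versions replace with exact critical values.

```python
# Control-function (augmented regression) version of the DWH test:
# include the first-stage residuals in the structural equation and test
# their coefficient against zero. Sketch with simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
z = rng.standard_normal(n)                        # instrument
u = rng.standard_normal(n)
x = 0.8 * z + 0.6 * u + rng.standard_normal(n)    # endogenous regressor
y = 1.0 + 0.5 * x + u                             # u enters y, so x is endogenous

first = sm.OLS(x, sm.add_constant(z)).fit()
v_hat = first.resid                               # first-stage residuals
X = sm.add_constant(np.column_stack([x, v_hat]))
second = sm.OLS(y, X).fit()
print("t-stat on first-stage residuals:", second.tvalues[2])
# A large |t| rejects exogeneity of x (here it should reject).
```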
  5. By: Susanna Gallani (Harvard Business School, Accounting and Management Unit); Ranjani Krishnan (Eli Broad School of Management, Michigan State University)
    Abstract: Survey research studies make extensive use of rating scales to measure constructs of interest. The bounded nature of such scales presents econometric estimation challenges. Linear estimation methods (e.g. OLS) often produce predicted values that lie outside the rating scales, and fail to account for nonconstant effects of the predictors. Established nonlinear approaches such as logit and probit transformations attenuate many shortcomings of linear methods. However, these nonlinear approaches are challenged by corner solutions, for which they require ad hoc transformations. Censored and truncated regressions alter the composition of the sample, while Tobit methods rely on distributional assumptions that are frequently not reflected in survey data, especially when observations fall at one extreme of the scale owing to surveyor and respondent characteristics. The fractional response model (FRM) (Papke and Wooldridge 1996, 2008) overcomes many limitations of established linear and non-linear econometric solutions in the study of bounded data. In this study, we first review the econometric characteristics of the FRM and discuss its applicability to survey-based studies in accounting. Second, we present results from Monte Carlo simulations to highlight the advantages of using the FRM relative to conventional models. Finally, we use data from a hospital patient satisfaction survey, compare the estimation results from a traditional OLS method and the FRM, and conclude that the FRM provides an improved methodological approach to the study of bounded dependent variables.
    Keywords: Fractional response model, bounded variables, simulation
    JEL: C23 C24 C25 C15 I18 M41
    Date: 2015–08
    URL: http://d.repec.org/n?u=RePEc:hbs:wpaper:16-016&r=ecm
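    In practice the FRM of Papke and Wooldridge is a quasi-MLE: a Binomial GLM with logit link fit to the fractional outcome, with robust standard errors. A sketch on simulated data; all variable names are illustrative.

```python
# Fractional logit quasi-MLE: Binomial GLM with logit link applied to a
# bounded outcome in [0, 1], with heteroskedasticity-robust standard
# errors, as the QMLE interpretation requires.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
x = rng.standard_normal(n)
mu = 1 / (1 + np.exp(-(0.5 + 1.2 * x)))            # conditional mean in (0, 1)
y = np.clip(mu + 0.1 * rng.standard_normal(n), 0, 1)  # bounded, mass at 0 and 1

X = sm.add_constant(x)
frm = sm.GLM(y, X, family=sm.families.Binomial()).fit(cov_type="HC1")
print(frm.params, frm.bse)                          # QMLE coefficients, robust SEs

# Contrast: OLS fitted values are not guaranteed to respect the bounds
ols = sm.OLS(y, X).fit()
print("OLS fitted range:", ols.fittedvalues.min(), ols.fittedvalues.max())
```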
  6. By: Bernd Hayo (University of Marburg)
    Abstract: There is a widespread belief among economists that adding variables to a regression model increases coefficients' standard errors. This note shows that, in general, this belief is unfounded and that the impact of adding variables on standard errors is ambiguous. The concept of standard-error-decreasing complementarity is introduced, which works against the collinearity-induced increase in standard errors. How standard-error-decreasing complementarity works is illustrated with the help of a nontechnical heuristic, and, using an example based on artificial data, it is shown that the outcomes of popular econometric approaches can be misleading.
    Keywords: Standard-error-decreasing complementarity, multivariate regression model, standard error, econometric methodology, multicollinearity, collinearity
    JEL: C1 B4
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:mar:magkse:201703&r=ecm
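    The point is easy to reproduce with artificial data: adding a regressor that soaks up residual variance can lower a coefficient's standard error even though it is collinear with the variable of interest. A minimal sketch, not the note's own example:

```python
# Standard-error-decreasing complementarity: the drop in residual
# variance from adding x2 outweighs the collinearity penalty on x1.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1 = rng.standard_normal(n)
x2 = 0.3 * x1 + rng.standard_normal(n)          # mildly collinear with x1
y = 1.0 * x1 + 3.0 * x2 + 0.5 * rng.standard_normal(n)

without_x2 = sm.OLS(y, sm.add_constant(x1)).fit()
with_x2 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
print("SE of b1, x2 omitted :", without_x2.bse[1])
print("SE of b1, x2 included:", with_x2.bse[1])  # smaller despite collinearity
```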
  7. By: Javier Hualde (Universidad Publica de Navarra); Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: We consider truncated (or conditional) sum-of-squares estimation of a parametric model composed of a fractional time series and an additive generalized polynomial trend. Both the memory parameter, which characterizes the behaviour of the stochastic component of the model, and the exponent parameter, which drives the shape of the deterministic component, are treated as unknown real numbers lying in arbitrarily large (but finite) intervals. Thus, our model captures different forms of nonstationarity and noninvertibility. As in related settings, the proof of consistency (which is a prerequisite for proving asymptotic normality) is challenging due to non-uniform convergence of the objective function over a large admissible parameter space, but, in addition, our framework is substantially more involved due to the competition between the stochastic and deterministic components. We establish consistency and asymptotic normality under quite general circumstances, finding that the results differ crucially depending on the relative strength of the deterministic and stochastic components.
    Keywords: Asymptotic normality, consistency, deterministic trend, fractional process, generalized polynomial trend, noninvertibility, nonstationarity, truncated sum of squares estimation
    JEL: C22
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1376&r=ecm
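    The truncated-sum-of-squares idea, stripped of the deterministic trend component the paper adds, can be sketched for an ARFIMA(0, d, 0): fractionally difference the series with truncated filter weights and minimize the residual sum of squares over d.

```python
# Conditional (truncated) sum-of-squares estimation of the memory
# parameter d in an ARFIMA(0, d, 0). Minimal version without the
# generalized polynomial trend treated in the paper.
import numpy as np
from scipy.optimize import minimize_scalar

def fracdiff(x, d):
    """Truncated (1 - L)^d filter applied to x (type II convention)."""
    n = len(x)
    pi = np.empty(n)
    pi[0] = 1.0
    for k in range(1, n):
        pi[k] = pi[k - 1] * (k - 1 - d) / k      # binomial expansion weights
    return np.array([pi[:t + 1][::-1] @ x[:t + 1] for t in range(n)])

def css(d, x):
    return np.sum(fracdiff(x, d) ** 2)           # truncated sum of squares

# Simulate an ARFIMA(0, d, 0) with d = 0.3 by applying (1 - L)^{-d}
rng = np.random.default_rng(0)
n, d_true = 1000, 0.3
x = fracdiff(rng.standard_normal(n), -d_true)
res = minimize_scalar(css, bounds=(-0.49, 0.99), args=(x,), method="bounded")
print("d-hat:", res.x)                           # close to 0.3
```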
  8. By: Mynbaev, Kairat; Martins-Filho, Carlos
    Abstract: We define a new bandwidth-dependent kernel density estimator that improves existing convergence rates for the bias, while preserving that of the variation, when the error is measured in L1. No assumptions beyond those in the extant literature are imposed.
    Keywords: Kernel density estimation, higher order kernels, bias reduction
    JEL: C14
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:75902&r=ecm
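    The classical device this line of work refines is bias reduction through the kernel itself: a fourth-order kernel, for instance, removes the leading O(h^2) bias term of the standard estimator. The sketch below compares second- and fourth-order Gaussian kernels by L1 error; the paper's bandwidth-dependent kernels and its L1 theory are not reproduced here.

```python
# Bias reduction with a higher-order kernel: the 4th-order Gaussian
# kernel (3 - u^2)/2 * phi(u) integrates to one and has zero second
# moment, removing the O(h^2) bias term. Sketch of the general idea.
import numpy as np
from scipy.stats import norm

def kde(grid, data, h, kernel):
    u = (grid[:, None] - data[None, :]) / h
    return kernel(u).mean(axis=1) / h

gauss2 = norm.pdf                                    # standard 2nd-order kernel
gauss4 = lambda u: 0.5 * (3 - u ** 2) * norm.pdf(u)  # 4th-order kernel

rng = np.random.default_rng(0)
data = rng.standard_normal(2000)
grid = np.linspace(-3, 3, 121)
dx = grid[1] - grid[0]
for name, k in [("order 2", gauss2), ("order 4", gauss4)]:
    err = np.abs(kde(grid, data, h=0.6, kernel=k) - norm.pdf(grid))
    print(name, "approximate L1 error:", err.sum() * dx)
```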
  9. By: Rinke, Saskia; Busch, Marie; Leschinski, Christian
    Abstract: The persistence of inflation rates is of major importance to central banks because it determines the costs of monetary policy according to the Phillips curve. This article is motivated by newly available econometric methods which allow for consistent estimation of the persistence parameter under low-frequency contaminations and consistent break point estimation under long memory, without a priori assumptions on the presence of breaks. In contrast to previous studies, we allow for smooth trends in addition to breaks as a source of spurious long memory. We support the finding of reduced memory parameters in monthly inflation rates of the G7 countries as well as spurious long memory, except for the US. Nevertheless, only a few breaks can be located. Instead, all countries exhibit significant trends at the 5 percent level, with the exception of the US.
    Keywords: Spurious Long Memory; Breaks; Trends; Inflation; G7 countries
    JEL: C13 E58
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-584&r=ecm
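    The spurious-long-memory phenomenon the paper builds on is easy to simulate: a short-memory AR(1) with a single mean shift already yields a sizeable log-periodogram (GPH) estimate of the memory parameter d. A sketch, not the paper's estimation or testing procedure:

```python
# Spurious long memory in miniature: a mean break in a short-memory
# series inflates the GPH log-periodogram estimate of d.
import numpy as np

def gph(x, m):
    """Log-periodogram regression estimate of d on the first m frequencies."""
    n = len(x)
    w = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    reg = -np.log(4 * np.sin(w / 2) ** 2)
    reg -= reg.mean()
    return (reg @ np.log(I)) / (reg @ reg)        # OLS slope = d-hat

rng = np.random.default_rng(0)
n = 2000
ar = np.zeros(n)
for t in range(1, n):
    ar[t] = 0.3 * ar[t - 1] + rng.standard_normal()
shifted = ar + np.where(np.arange(n) > n // 2, 1.5, 0.0)   # one mean break

m = int(n ** 0.5)
print("d-hat, no break  :", gph(ar, m))           # near 0
print("d-hat, with break:", gph(shifted, m))      # spuriously positive
```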
  10. By: Barnichon, Régis; Brownlees, Christian
    Abstract: Vector Autoregressions (VAR) and Local Projections (LP) are well established methodologies for the estimation of Impulse Responses (IR). These techniques have complementary features: The VAR approach is more efficient when the model is correctly specified whereas the LP approach is less efficient but more robust to model misspecification. We propose a novel IR estimation methodology -- Smooth Local Projections (SLP) -- to strike a balance between these approaches. SLP consists in estimating LP under the assumption that the IR is a smooth function of the forecast horizon. Inference is carried out using semi-parametric techniques based on Penalized B-splines, which are straightforward to implement in practice. SLP preserves the flexibility of standard LP and at the same time can increase precision substantially. A simulation study shows the large gains in IR estimation accuracy of SLP over LP. We show how SLP may be used with common identification schemes such as timing restrictions and instrumental variables to directly recover structural IRs. We illustrate our technique by studying the effects of monetary shocks.
    Keywords: impulse response; local projections; semiparametric estimation
    JEL: C14 C32 C53 E47
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:11726&r=ecm
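    A simplified stand-in for SLP: estimate the local projections horizon by horizon and couple them through a second-difference (P-spline-type) penalty on the impulse-response coefficients; the paper's penalized B-splines play this smoothing role more flexibly.

```python
# Smooth local projections, simplified: horizon-h regressions of y_{t+h}
# on a shock x_t, with the impulse response shrunk toward a smooth
# function of h by penalizing its second differences.
import numpy as np

rng = np.random.default_rng(0)
T, H, lam = 400, 20, 200.0
x = rng.standard_normal(T)
true_ir = 0.9 ** np.arange(H + 1)                  # smooth true IR
y = np.convolve(x, true_ir)[:T] + rng.standard_normal(T)

s = np.zeros(H + 1)                                # sum of x_t^2 per horizon
b = np.zeros(H + 1)                                # sum of x_t * y_{t+h}
for h in range(H + 1):
    xs, ys = x[: T - h], y[h:]
    s[h], b[h] = xs @ xs, xs @ ys

D = np.diff(np.eye(H + 1), n=2, axis=0)            # second-difference matrix
beta_lp = b / s                                    # standard LP, one h at a time
beta_slp = np.linalg.solve(np.diag(s) + lam * D.T @ D, b)  # penalized LP
print("LP  RMSE:", np.sqrt(np.mean((beta_lp - true_ir) ** 2)))
print("SLP RMSE:", np.sqrt(np.mean((beta_slp - true_ir) ** 2)))  # typically lower
```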
  11. By: Hirsch, Tristan; Rinke, Saskia
    Abstract: Outlying observations in time series influence parameter estimation and testing procedures, leading to biased estimates and spurious test decisions. Further inference based on these results will be misleading. In this paper the effects of outliers on the performance of ratio-based tests for a change in persistence are investigated. We consider two types of outliers, additive outliers and innovative outliers. Our simulation results show that the effect of outliers crucially depends on the outlier type and on the degree of persistence of the underlying process. Additive outliers deteriorate the performance of the tests for high degrees of persistence. In contrast, innovative outliers do not negatively influence the performance of the tests. Since additive outliers lead to severe size distortions when the null hypothesis under consideration is described by a nonstationary process, we apply an outlier detection method designed for unit-root testing. The adjustment of the series results in size improvements and power gains. In an empirical example we apply the tests and the outlier detection method to the G7 inflation rates.
    Keywords: Additive Outliers; Innovative Outliers; Change in Persistence; Outlier Detection; Monte Carlo
    JEL: C15 C22
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-583&r=ecm
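    The contrast between the two outlier types is easy to simulate: an additive outlier perturbs the observed level and biases the OLS persistence estimate toward zero, while an innovative outlier feeds through the dynamics and is far less harmful. A minimal illustration of the paper's starting point:

```python
# Additive vs innovative outliers and the OLS AR(1) estimate.
import numpy as np

def ar1(n, rho, shocks):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + shocks[t]
    return y

def ols_rho(y):
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

rng = np.random.default_rng(0)
n, rho = 500, 0.9
e = rng.standard_normal(n)
pos = rng.choice(np.arange(1, n), size=5, replace=False)  # 5 outliers of size 10

clean = ar1(n, rho, e)
io = e.copy()
io[pos] += 10.0                                 # innovative: hits the shock
ao = clean.copy()
ao[pos] += 10.0                                 # additive: hits the level
print("clean     :", ols_rho(clean))
print("innovative:", ols_rho(ar1(n, rho, io)))
print("additive  :", ols_rho(ao))               # biased toward zero
```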
  12. By: Breunig, Christoph; Kummer, Michael; Ohnemus, Jorg; Viete, Steffen
    Abstract: Missing values are a major problem in all econometric applications based on survey data. A standard approach assumes data are missing-at-random and uses imputation methods, or even listwise deletion. This approach is justified if item non-response does not depend on the potentially missing variables' realization. However, assuming missing-at-random may introduce bias if non-response is, in fact, selective. Relevant applications range from financial or strategic firm-level data to individual-level data on income or privacy-sensitive behaviors. In this paper, we propose a novel approach to deal with selective item nonresponse in the model's dependent variable. Our approach is based on instrumental variables that affect selection only through potential outcomes. In addition, we allow for endogenous regressors. We establish identification of the structural parameter and propose a simple two-step estimation procedure for it. Our estimator is consistent and robust against biases that would prevail when assuming missingness at random. We implement the estimation procedure using firm-level survey data and a binary instrumental variable to estimate the effect of outsourcing on productivity.
    Keywords: endogenous selection, IV-estimation, inverse probability weighting, missing data, productivity, outsourcing, semiparametric estimation
    JEL: C14 C36 D24 L24
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:zbw:zewdip:16092&r=ecm
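    For contrast with the paper's estimator, the sketch below shows the standard missing-at-random fix, inverse probability weighting on observables, staying biased when non-response depends on the missing outcome itself, which is exactly the situation the paper's instrument-based approach is designed for. The DGP and all names are illustrative.

```python
# IPW on observables fails under outcome-dependent non-response.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)
y = 1.0 + 0.5 * x + rng.standard_normal(n)
observed = rng.random(n) < 1 / (1 + np.exp(0.8 * y - 1))  # response falls with y

sel = sm.Logit(observed.astype(float), sm.add_constant(x)).fit(disp=0)
w = 1 / sel.predict(sm.add_constant(x))[observed]         # weights from x only
ipw = sm.WLS(y[observed], sm.add_constant(x[observed]), weights=w).fit()
ols = sm.OLS(y[observed], sm.add_constant(x[observed])).fit()
print("true slope              : 0.5")
print("complete-case OLS slope :", ols.params[1])
print("IPW-on-x slope          :", ipw.params[1])  # still biased: MAR fails
```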
  13. By: Bógalo, Juan; Poncela, Pilar; Senra, Eva
    Abstract: Singular Spectrum Analysis (SSA) is a nonparametric technique for signal extraction in time series based on principal components. However, it requires the intervention of the analyst to identify the frequencies associated with the extracted principal components. We propose a new variant of SSA, Circulant SSA (CSSA), that automatically makes this association. We also prove the validity of CSSA for the nonstationary case. Through several sets of simulations, we show the good properties of our approach: it is reliable, fast, automatic, and produces strongly separable elementary components by frequency. Finally, we apply Circulant SSA to the Industrial Production Index of six countries. We use it to deseasonalize the series and to illustrate that it also reproduces a cycle in accordance with the dated recessions from the OECD.
    Keywords: circulant matrices, signal extraction, singular spectrum analysis, non-parametric, time series, Toeplitz matrices.
    JEL: C22 E32
    Date: 2017–01–05
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:76023&r=ecm
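    The key fact CSSA exploits is that circulant matrices are diagonalized by the discrete Fourier basis, so every elementary component is tied to a known frequency a priori. The toy sketch below extracts a seasonal component by selecting Fourier frequencies directly; the actual procedure operates on the trajectory matrix.

```python
# Frequency-indexed component extraction, the idea behind Circulant SSA:
# select the Fourier coefficients near the seasonal frequency 1/12 and
# invert. Toy illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 240                                          # 20 "years" of monthly data
t = np.arange(n)
series = 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(n)

f = np.fft.rfftfreq(n)                           # frequencies in cycles/period
coef = np.fft.rfft(series)
seasonal_band = np.isclose(f, 1 / 12, atol=1 / n)  # frequencies near 1/12
comp = np.fft.irfft(np.where(seasonal_band, coef, 0), n)
print("correlation with true seasonal:",
      np.corrcoef(comp, 2 * np.sin(2 * np.pi * t / 12))[0, 1])
```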
  14. By: Carlos Viana de Carvalho (Department of Economics, PUC-Rio); Ricardo Masini (São Paulo School of Economics, Getúlio Vargas Foundation); Marcelo Cunha Medeiros (Department of Economics, PUC-Rio)
    Abstract: Recently, there has been growing interest in developing econometric tools to conduct counterfactual analysis with aggregate data when a “treated” unit suffers an intervention, such as a policy change, and there is no obvious control group. Usually, the proposed methods are based on the construction of an artificial counterfactual from a pool of “untreated” peers, organized in a panel data structure. In this paper, we investigate the consequences of applying such methodologies when the data are formed by integrated processes of order 1. We find that without a cointegration relation (the spurious case) the intervention estimator diverges, resulting in rejection of the hypothesis of no intervention effect regardless of whether such an effect exists. When at least one cointegration relation exists, by contrast, the intervention-effect estimator is √T-consistent, albeit with a non-standard distribution; even in this case, however, the test of no intervention effect is extremely oversized if nonstationarity is ignored. When a drift is present in the data generating processes, the estimator for both cases (cointegrated and spurious) either diverges or is not well defined asymptotically. As a final recommendation we suggest working in first differences to avoid spurious results.
    Date: 2016–12
    URL: http://d.repec.org/n?u=RePEc:rio:texdis:654&r=ecm
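    The warning is easy to reproduce: fit a counterfactual for one random walk from an independent one, and the estimated "intervention effect" in levels is large despite no intervention at all, while first differences behave. One treated unit, one peer, OLS; purely illustrative.

```python
# Spurious counterfactuals with integrated processes: levels vs
# first differences.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
T, T0 = 400, 200                                 # "intervention" at T0
y = np.cumsum(rng.standard_normal(T))            # treated unit: random walk
p = np.cumsum(rng.standard_normal(T))            # peer: independent random walk

def effect(a, b):
    fit = sm.OLS(a[:T0], sm.add_constant(b[:T0])).fit()  # pre-period fit
    gap = a[T0:] - fit.predict(sm.add_constant(b[T0:]))  # post-period gap
    return gap.mean()                            # estimated intervention effect

print("levels           :", effect(y, p))        # typically far from zero
print("first differences:", effect(np.diff(y), np.diff(p)))  # near zero
```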
  15. By: Stéphanie Aerts; Ines Wilms
    Abstract: Quadratic and Linear Discriminant Analysis (QDA/LDA) are the most often applied classification rules under normality. In QDA, a separate covariance matrix is estimated for each group. If there are more variables than observations in the groups, the usual estimates are singular and can no longer be used. Assuming homoscedasticity, as in LDA, reduces the number of parameters to estimate. This rather strong assumption is, however, rarely verified in practice. Regularized discriminant techniques that are computable in high dimensions and cover the path between the two extremes, QDA and LDA, have been proposed in the literature. However, these procedures rely on sample covariance matrices. As such, they become inappropriate in the presence of cellwise outliers, a type of outlier that is very likely to occur in high-dimensional datasets. In this paper, we propose cellwise robust counterparts of these regularized discriminant techniques by inserting cellwise robust covariance matrices. Our methodology results in a family of discriminant methods that (i) are robust against outlying cells, (ii) cover the gap between LDA and QDA, and (iii) are computable in high dimensions. The good performance of the new methods is illustrated through simulated and real data examples. As a by-product, visual tools are provided for the detection of outliers.
    Keywords: Cellwise robust precision matrix, Classification, Discriminant analysis, Penalized estimation
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:ete:kbiper:563648&r=ecm
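    The QDA-LDA path the authors robustify can be sketched with Friedman-style covariance shrinkage: each group covariance is pulled toward the pooled one. The paper's contribution is to plug cellwise robust covariance estimates into this construction; the sketch below uses plain sample covariances.

```python
# Regularized discriminant path: alpha = 0 gives LDA (pooled covariance),
# alpha = 1 gives QDA (group covariances).
import numpy as np
from scipy.stats import multivariate_normal

def rda_covariances(groups, alpha):
    ns = np.array([len(g) for g in groups])
    covs = [np.cov(g, rowvar=False) for g in groups]
    pooled = sum((n - 1) * S for n, S in zip(ns, covs)) / (ns.sum() - len(ns))
    return [alpha * S + (1 - alpha) * pooled for S in covs]

def classify(x, means, covs, priors):
    scores = [np.log(p) + multivariate_normal.logpdf(x, m, S)
              for m, S, p in zip(means, covs, priors)]
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
g0 = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], 50)
g1 = rng.multivariate_normal([2, 1], [[1, -0.3], [-0.3, 2]], 50)
means = [g0.mean(axis=0), g1.mean(axis=0)]
for alpha in (0.0, 0.5, 1.0):                    # LDA ... QDA
    covs = rda_covariances([g0, g1], alpha)
    print(f"alpha={alpha}: point (1.0, 0.5) -> group",
          classify([1.0, 0.5], means, covs, [0.5, 0.5]))
```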
  16. By: Andrew G. Chapple
    Abstract: Time-to-event data for econometric tragedies, like mass shootings, have largely been ignored from a changepoint-analysis standpoint. We outline a technique for modelling economic changepoint problems using a piecewise constant hazard model to explain different economic phenomena. Specifically, we investigate the rates of mass shootings in the United States since August 20th, 1982 as a case study to examine changes in the rates of these terrible events, in an attempt to connect such changes to shooter covariates or to policy and societal changes.
    Keywords: Time-to-event Data, Bayesian Analyses, Piecewise Exponential, Reversible Jump, Mass Shooting.
    JEL: C11 C22
    Date: 2016–11–24
    URL: http://d.repec.org/n?u=RePEc:eei:rpaper:eeri_rp_2016_24&r=ecm
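    Given the changepoints, the piecewise-exponential building block is simple: within each interval the hazard MLE is the number of events divided by total exposure time. The reversible-jump MCMC step, which infers the number and location of changepoints, is not sketched here; all numbers below are hypothetical.

```python
# Piecewise-constant hazard MLE with known changepoint: events / exposure
# per interval. Event times are simulated from a hazard that drops at tau.
import numpy as np

rng = np.random.default_rng(0)
h1, h2, tau, n = 0.10, 0.02, 30.0, 2000          # hypothetical rates and break
u = rng.exponential(size=n)                      # cumulative-hazard draws
times = np.where(u < h1 * tau, u / h1, tau + (u - h1 * tau) / h2)

for lo, hi in [(0.0, tau), (tau, np.inf)]:
    events = np.sum((times >= lo) & (times < hi))
    exposure = np.sum(np.clip(times, lo, hi) - lo)  # time at risk in interval
    print(f"interval [{lo}, {hi}): hazard MLE = {events / exposure:.4f}")
```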
  17. By: Yuta Yamauchi (Graduate School of Economics, The University of Tokyo); Yasuhiro Omori (Faculty of Economics, The University of Tokyo)
    Abstract: Although stochastic volatility and GARCH models have been successful in describing the volatility dynamics of univariate asset returns, their natural extension to multivariate models with dynamic correlations has been difficult due to several major problems. First, there are too many parameters to estimate if the only available data are daily returns, which results in unstable estimates. One solution to this problem is to incorporate additional observations based on intraday asset returns, such as realized covariances. Second, however, since multivariate asset returns are not traded synchronously, we have to use the largest time intervals over which all asset returns are observed to compute the realized covariance matrices, thereby failing to make full use of the available intraday information when some assets are traded less frequently. Third, it is not straightforward to guarantee that the estimated (and the realized) covariance matrices are positive definite. Our contributions are: (1) we obtain stable parameter estimates for dynamic correlation models using the realized measures; (2) we make full use of intraday information by using pairwise realized correlations; (3) the covariance matrices are guaranteed to be positive definite; (4) we avoid the arbitrariness of the ordering of asset returns; (5) we propose a flexible correlation structure model (e.g., setting some correlations to be identically zero if necessary); and (6) we propose a parsimonious specification for the leverage effect. Our proposed models are applied to daily returns of nine U.S. stocks with their realized volatilities and pairwise realized correlations, and are shown to outperform the existing models with regard to portfolio performance.
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2016cf1029&r=ecm
  18. By: D.S.G. Pollock
    Abstract: Discrete-time ARMA processes can be placed in one-to-one correspondence with a set of continuous-time processes that are bounded in frequency by the Nyquist value of π radians per sample period. It is well known that, if data are sampled from a continuous process whose maximum frequency exceeds the Nyquist value, then there will be a problem of aliasing. However, if the sampling is too rapid, then other problems will arise that may cause the ARMA estimates to be severely biased. The paper reveals the nature of these problems and shows how they may be overcome.
    Keywords: ARMA Modelling, Stochastic Differential Equations, Frequency-Limited Stochastic Processes, Oversampling
    JEL: C22 C32 E32
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:17/03&r=ecm
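    The oversampling problem in miniature: band-limit white noise to a quarter of the Nyquist frequency and fit an AR(2); the estimated autoregressive roots crowd the unit circle even though the underlying input is white. A sketch only, not the paper's remedy.

```python
# Fitting a low-order AR model to an oversampled (frequency-limited)
# process: Yule-Walker AR(2) roots approach the unit circle.
import numpy as np

rng = np.random.default_rng(0)
n = 4096
spec = np.fft.rfft(rng.standard_normal(n))
freq = np.fft.rfftfreq(n)                        # cycles/sample, Nyquist = 0.5
x = np.fft.irfft(np.where(freq <= 0.125, spec, 0), n)  # band limit at pi/4

def acf(x, k):
    xc = x - x.mean()
    return (xc[:-k] @ xc[k:]) / (xc @ xc)

r1, r2 = acf(x, 1), acf(x, 2)
phi1 = r1 * (1 - r2) / (1 - r1 ** 2)             # Yule-Walker AR(2)
phi2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
roots = np.roots([1, -phi1, -phi2])              # roots of z^2 - phi1 z - phi2
print("AR(2) root moduli:", np.abs(roots))       # close to 1: near-unit roots
```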
  19. By: Christoph Wunderer
    Abstract: Asset correlations play an important role in credit portfolio modelling. One possible data source for their estimation is default time series. This study investigates the systematic error that is made if the exposure pool underlying a default time series is assumed to be homogeneous when in reality it is not. We find that the asset correlation will always be underestimated if homogeneity with respect to the probability of default (PD) is wrongly assumed, and the error grows with the spread of the PD within the exposure pool. If the exposure pool is inhomogeneous with respect to the asset correlation itself, the error may go in either direction, but for most PD and asset correlation ranges relevant in practice the asset correlation is systematically underestimated. Both effects stack up, and the error tends to become even larger if, in addition, we assume a negative correlation between asset correlation and PD within the exposure pool, an assumption that is plausible in many circumstances and consistent with the Basel RWA formula. It is argued that the generic inhomogeneity effect described in this paper is one reason why asset correlations measured from default data tend to be lower than asset correlations derived from asset value data.
    Date: 2017–01
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1701.02028&r=ecm
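    The homogeneity bias can be reproduced with a one-factor Vasicek simulation: generate defaults from two PD sub-pools, then recover the asset correlation by moment matching as if the pool had a single average PD. The setup below is illustrative, not the paper's exact estimation procedure.

```python
# One-factor Vasicek defaults from an inhomogeneous pool; correlation
# recovered under a (wrong) homogeneity assumption via moment matching.
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import brentq

rng = np.random.default_rng(0)
rho_true, years, n_obligors = 0.20, 2000, 2000
pds = [0.005, 0.10]                              # inhomogeneous sub-pool PDs

z = rng.standard_normal(years)                   # common factor, one per year
rates = np.zeros(years)
for pd in pds:
    eps = rng.standard_normal((years, n_obligors))
    assets = np.sqrt(rho_true) * z[:, None] + np.sqrt(1 - rho_true) * eps
    rates += (assets < norm.ppf(pd)).mean(axis=1) / len(pds)

def dr_variance(rho, pd):
    """Default-rate variance in a large homogeneous Vasicek pool."""
    c = norm.ppf(pd)
    return multivariate_normal([0, 0], [[1, rho], [rho, 1]]).cdf([c, c]) - pd ** 2

pd_bar = rates.mean()                            # homogeneity assumption: one PD
rho_hat = brentq(lambda r: dr_variance(r, pd_bar) - rates.var(), 1e-4, 0.99)
print("true rho:", rho_true, "| estimate under assumed homogeneity:", rho_hat)
```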
  20. By: Fornaro, Paolo; Luomaranta, Henri; Saarinen, Lauri
    Abstract: We adopt a series of shrinkage and factor-analytic methodologies to compute nowcasts of the main Finnish turnover indexes, using continuously accumulating firm-level data. We show that the estimates based on large-dimensional models provide an accurate and timelier alternative to the ones currently produced by Statistics Finland, even after taking data revisions into account. In particular, we find that the turnovers for the service sector can be estimated with high accuracy five days after the reference month has ended, giving more accurate and faster predictions than the first official internal release. For other sectors, the large-dimensional models provide good nowcasting performance, even though there is a timeliness-accuracy trade-off. Finally, we propose a factor-based methodology to improve the accuracy of the current flash estimates by imputing part of the data sources, and find that we are able to provide better predictions in a more expedited fashion for all sectors of interest.
    Keywords: Dynamic factor models, firm-level data, nowcasting, shrinkage
    JEL: C31 C53 C55
    Date: 2017–01–10
    URL: http://d.repec.org/n?u=RePEc:rif:wpaper:46&r=ecm
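    A generic version of the factor approach: extract principal components from a large simulated firm-level panel and bridge them to the target index by regression. All data are simulated; the paper's shrinkage variants replace the plain OLS bridge equation used here.

```python
# Factor nowcast sketch: PCA factors from a wide panel, then a bridge
# regression of the target on the estimated factors.
import numpy as np

rng = np.random.default_rng(0)
T, N, k = 120, 300, 3                            # months, firms, factors
F = rng.standard_normal((T, k))                  # latent common factors
panel = F @ rng.standard_normal((k, N)) + rng.standard_normal((T, N))
target = F @ np.array([1.0, -0.5, 0.3]) + 0.3 * rng.standard_normal(T)

Z = (panel - panel.mean(axis=0)) / panel.std(axis=0)   # standardize panel
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
fac = U[:, :k] * s[:k]                           # estimated factors (PCA)

beta, *_ = np.linalg.lstsq(fac[:-1], target[:-1], rcond=None)  # bridge eq.
print("nowcast:", fac[-1] @ beta, "| actual:", target[-1])
```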

This nep-ecm issue is ©2017 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.