nep-ecm New Economics Papers
on Econometrics
Issue of 2015‒06‒13
twenty-two papers chosen by
Sune Karlsson
Örebro universitet

  1. Medium Band Least Squares Estimation of Fractional Cointegration in the Presence of Low-Frequency Contamination By Bent Jesper Christensen; Rasmus T. Varneskov
  2. A Local Stable Bootstrap for Power Variations of Pure-Jump Semimartingales and Activity Index Estimation By Ulrich Hounyo; Rasmus T. Varneskov
  3. Filtering and likelihood estimation of latent factor jump-diffusions with an application to stochastic volatility models By Esposito, Francesco Paolo; Cummins, Mark
  4. Testing For Unit Roots With Cointegrated Data By W. Robert Reed
  5. "Trend, Seasonality and Economic Time Sseries:the Nonstationary Errors-in-variables Models" By Naoto Kunitomo; Seisho Sato
  6. "Heteroscedastic Nested Error Regression Models with Variance Functions" By Shonosuke Sugasawa; Tatsuya Kubokawa
  7. Estimating LASSO Risk and Noise Level By Bayati, Mohsen; Erdogdu, Murat A.; Montanari, Andrea
  8. Asymptotic Properties of QML Estimators for VARMA Models with Time-Dependent Coefficients: Part I By Abdelkamel Alj; Christophe Ley; Guy Melard
  9. Testing for the Diffusion Matrix in a Continuous-Time Markov Process Model with Applications to the Term Structure of Interest Rates By Fuchun Li
  10. Testing for Breaks in Regression Models with Dependent Data By Violetta Dalla; Javier Hidalgo
  11. Efficient Inference with Time-Varying Information and the New Keynesian Phillips Curve By Bertille Antoine; Otilia Boldea
  12. Multiple hypothesis testing of market risk forecasting models By Esposito, Francesco Paolo; Cummins, Mark
  13. Efficient estimation of Bayesian VARMAs with time-varying coefficients By Joshua C.C. Chan; Eric Eisenstat
  14. Inference Problems under a Special Form of Heteroskedasticity By Farbmacher, Helmut; Kögel, Heinrich
  15. The Econometrics of Networks: A Review By Daniel Felix Ahelegbey
  16. Regularized Estimation of Structural Instability in Factor Models: The US Macroeconomy and the Great Moderation By Laurent Callot; Johannes Tang Kristensen
  17. Error and Generalization in Discrete Choice Under Risk By Nathaniel T. Wilcox
  18. Joint modelling compared with two stage methods for analysing longitudinal data and prospective outcomes: a simulation study of childhood growth and BP By A. Sayers; J. Heron; A. Smith; C. Macdonald-Wallis; M. Gilthorpe; F. Steele; K. Tilling
  19. Data Uncertainty in Markov Chains: Application to Cost-Effectiveness Analyses of Medical Innovations By Goh, Joel; Bayati, Mohsen; Zenios, Stefanos A.; Singh, Sundeep; Moore, David
  20. Long Memory Through Marginalization of Large Systems and Hidden Cross-Section Dependence By Chevillon, Guillaume; Hecq, Alain; Laurent, Sébastien
  21. Overcoming the Forecast Combination Puzzle: Lessons from the Time-Varying Efficiency of Phillips Curve Forecasts of U.S. Inflation By Christopher G. Gibbs
  22. A New Approach to Estimation of the R&D-Innovation-Productivity Relationship By Christopher F Baum; Hans Lööf; Pardis Nabavi; Andreas Stephan

  1. By: Bent Jesper Christensen (Aarhus University and CREATES); Rasmus T. Varneskov (Aarhus University and CREATES)
    Abstract: This paper introduces a new estimator of the fractional cointegrating vector between stationary long memory processes that is robust to low-frequency contamination such as level shifts, i.e., structural changes in the means of the series, and deterministic trends. In particular, the proposed medium band least squares (MBLS) estimator uses sample dependent trimming of frequencies in the vicinity of the origin to account for such contamination. Consistency and asymptotic normality of the MBLS estimator are established, a feasible inference procedure is proposed, and rigorous tools for assessing the cointegration strength and testing MBLS against the existing narrow band least squares estimator are developed. Finally, the asymptotic framework for the MBLS estimator is used to provide new perspectives on volatility factors in an empirical application to long-span realized variance series for S&P 500 equities.
    Keywords: Deterministic Trends, Factor Models, Fractional Cointegration, Long Memory, Realized Variance, Semiparametric Estimation, Structural Change
    JEL: C12 C14 C32 C58
    Date: 2015–05–27
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-25&r=ecm
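    A minimal numpy sketch of the frequency-domain band regression underlying this class of estimators, with a user-chosen trimming l and bandwidth m; the paper's contribution is precisely a sample-dependent rule for the trimming, which is not reproduced here:

      import numpy as np

      def band_ls(x, y, l, m):
          # Regress y on x over Fourier frequencies j = l..m. Narrow band
          # least squares corresponds to l = 1; trimming the first l-1
          # frequencies is what guards against level shifts and trends
          # concentrated near the origin. Illustrative sketch only.
          fx = np.fft.fft(x - x.mean())
          fy = np.fft.fft(y - y.mean())
          Ixx = (np.abs(fx) ** 2)[l:m + 1]       # periodogram of x
          Iyx = (fy * np.conj(fx))[l:m + 1]      # cross-periodogram
          return np.real(Iyx).sum() / Ixx.sum()  # band least squares slope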
  2. By: Ulrich Hounyo (Oxford-Man Institute, University of Oxford, and Aarhus University and CREATES); Rasmus T. Varneskov (Aarhus University and CREATES)
    Abstract: We provide a new resampling procedure - the local stable bootstrap - that is able to mimic the dependence properties of realized power variations for pure-jump semimartingales observed at different frequencies. This allows us to propose a bootstrap estimator and inference procedure for the activity index of the underlying process, β, as well as a bootstrap test for whether it obeys a jump-diffusion or a pure-jump process, that is, of the null hypothesis H0: β=2 against the alternative H1: β<2. We establish first-order asymptotic validity of the resulting bootstrap power variations, activity index estimator, and diffusion test for H0. Moreover, the finite sample size and power properties of the proposed diffusion test are compared to those of benchmark tests using Monte Carlo simulations. Unlike existing procedures, our bootstrap test is correctly sized in general settings. Finally, we illustrate use and properties of the new bootstrap diffusion test using high-frequency data on three FX series, the S&P 500, and the VIX.
    Keywords: Activity index, Bootstrap, Blumenthal-Getoor index, Confidence Intervals, High-frequency Data, Hypothesis Testing, Realized Power Variation, Stable Processes
    JEL: C12 C14 C15 G1
    Date: 2015–05–27
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-26&r=ecm
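    A hedged numpy sketch of the two objects named above: the realized power variation of a sampled path, and a crude two-scale ratio estimate of the activity index β, in the spirit of activity signature functions rather than the paper's bootstrap procedure (the power p and the halving of the sampling frequency are assumptions):

      import numpy as np

      def power_variation(x, p):
          # Realized power variation: sum of |increments|^p of the path.
          return np.sum(np.abs(np.diff(x)) ** p)

      def activity_index(x, p=1.0):
          # Compare power variation at the full and at half the sampling
          # frequency; beta = 2 corresponds to a diffusion, beta < 2 to a
          # pure-jump process of lower activity.
          v_fine = power_variation(x, p)         # sampling interval delta
          v_coarse = power_variation(x[::2], p)  # sampling interval 2*delta
          return p * np.log(2) / (np.log(2) + np.log(v_coarse) - np.log(v_fine))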
  3. By: Esposito, Francesco Paolo; Cummins, Mark
    Abstract: In this article we use a partial integro-differential approach to construct and extend a non-linear filter to include jump components in the system state. We employ the enhanced filter to estimate the latent state of multivariate parametric jump-diffusions. The devised procedure is flexible and can be applied to non-affine diffusions as well as to state-dependent jump intensities and jump size distributions. The particular design of the system state can also provide an estimate of the jump times and sizes. Using the same approach by which the filter is devised, we implement an approximate likelihood for the parameter estimation of models in the jump-diffusion class. In developing the estimation function, we take particular care to design a simplified algorithm for its computation. The likelihood function is then characterised in the application to stochastic volatility models with jumps. In the empirical section we validate the proposed approach via Monte Carlo experiments. We treat the volatility as an intrinsic latent factor, which is partially observable through the integrated variance, a new system state component introduced to increase the filtered information content and allow closer tracking of the latent volatility factor. Further, we analyse the structure of the measurement error, particularly in relation to the presence of jumps in the system. In connection with this, we detect and address an issue arising in the update equation, improving the system state estimate.
    Keywords: latent state-variables, non-linear filtering, finite difference method, multi-variate jump-diffusions, likelihood estimation
    JEL: C13
    Date: 2015–05–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:64987&r=ecm
  4. By: W. Robert Reed (University of Canterbury)
    Abstract: This paper demonstrates that unit root tests can suffer from inflated Type I error rates when data are cointegrated. Results from Monte Carlo simulations show that three commonly used unit root tests – the ADF, Phillips-Perron, and DF-GLS tests – frequently overreject the true null of a unit root for at least one of the cointegrated variables. The reason for this overrejection is that unit root tests, designed for random walk data, are often misspecified when data are cointegrated. While the addition of lagged differenced (LD) terms can eliminate the size distortion, this “success” is spurious, driven by collinearity between the lagged dependent variable and the LD explanatory variables. Accordingly, standard diagnostics such as (i) testing for serial correlation in the residuals and (ii) using information criteria to select among different lag specifications are futile. The implication of these results is that researchers should be conservative in the weight they place on the results of unit root tests when the data may be cointegrated.
    Keywords: Unit root testing, cointegration, DF-GLS test, Augmented Dickey-Fuller test, Phillips-Perron test, simulation
    JEL: C32 C22 C18
    Date: 2015–05–30
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:15/11&r=ecm
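    The overrejection is easy to reproduce. A minimal statsmodels sketch with an illustrative DGP (a random walk x plus stationary noise, so y and x are cointegrated), not the paper's exact designs:

      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(0)
      reps, n, rejections = 500, 200, 0
      for _ in range(reps):
          x = np.cumsum(rng.standard_normal(n))  # random walk
          y = x + rng.standard_normal(n)         # cointegrated with x, still I(1)
          pval = adfuller(y, maxlag=0, regression="c", autolag=None)[1]
          rejections += pval < 0.05
      print(f"rejection rate for a true unit root: {rejections / reps:.3f}")

    With no lag augmentation, the first difference of y has a neglected moving-average component; that misspecification drives the inflated rejection rate, and adding lagged differences masks it only through the collinearity discussed above.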
  5. By: Naoto Kunitomo (Faculty of Economics, The University of Tokyo); Seisho Sato (Faculty of Economics, The University of Tokyo)
    Abstract: The use of seasonally adjusted (official) data may pose statistical problems because it is common practice to use X-12-ARIMA in the official seasonal adjustment, which adopts univariate ARIMA time series modeling with some refinements. Instead of using the seasonally adjusted data, for estimating the structural parameters and relationships among non-stationary economic time series with seasonality and noise, we propose a new method called the Separating Information Maximum Likelihood (SIML) estimation. We show that the SIML estimation can identify the non-stationary trend, the seasonality and the noise components, which have been observed in many macro-economic time series, and recover the structural parameters and relationships among the non-stationary trends with seasonality. The SIML estimation is consistent and asymptotically normal when the sample size is large. Based on simulations, we find that the SIML estimator has reasonable finite sample properties and thus would be useful in practice.
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2015cf977&r=ecm
  6. By: Shonosuke Sugasawa (Graduate School of Economics, The University of Tokyo); Tatsuya Kubokawa (Faculty of Economics, The University of Tokyo)
    Abstract: The article considers a nested error regression model with heteroscedastic variance functions for analyzing clustered data, where normality of the underlying distributions is not assumed. Classical methods for normal nested error regression models with homogeneous variances are extended in two directions: heterogeneous variance functions for the error terms, and non-normal distributions for the random effects and error terms. Consistent estimators for the model parameters are suggested, and second-order approximations of their biases and variances are derived. The mean squared errors of the empirical best linear unbiased predictors are expressed explicitly to second order. Second-order unbiased estimators of the mean squared errors are provided analytically in closed form. The proposed model and the resulting procedures are numerically investigated through simulation and empirical studies.
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2015cf978&r=ecm
  7. By: Bayati, Mohsen (Stanford University); Erdogdu, Murat A. (Stanford University); Montanari, Andrea (Stanford University)
    Abstract: We study the fundamental problems of variance and risk estimation in high dimensional statistical modeling. In particular, we consider the problem of learning a coefficient vector θ0 ∈ R^p from noisy linear observations y = Xθ0 + w ∈ R^n (p > n) and the popular estimation procedure of solving the ℓ1-penalized least squares objective known as the LASSO or Basis Pursuit DeNoising (BPDN). In this context, we develop new estimators for the ℓ2 estimation risk ||θ̂ − θ0||_2 and the variance of the noise when the distributions of θ0 and w are unknown. These can be used to select the regularization parameter optimally. Our approach combines Stein's unbiased risk estimate [Ste81] and the recent results of [BM12a] [BM12b] on the analysis of approximate message passing and the risk of the LASSO. We establish high-dimensional consistency of our estimators for sequences of matrices X of increasing dimensions, with independent Gaussian entries. We establish validity for a broader class of Gaussian designs, conditional on a certain conjecture from statistical physics. To the best of our knowledge, this result is the first that provides an asymptotically consistent risk estimator for the LASSO solely based on data. In addition, we demonstrate through simulations that our variance estimation outperforms several existing methods in the literature.
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:ecl:stabus:3284&r=ecm
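    A sketch of the estimation setting in scikit-learn notation, with a naive residual-based noise estimate standing in for the paper's SURE/AMP-based estimators (the DGP, the fixed penalty alpha and the degrees-of-freedom correction are assumptions):

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(1)
      n, p, s, sigma = 200, 500, 20, 1.0
      X = rng.standard_normal((n, p))
      theta0 = np.zeros(p)
      theta0[:s] = 1.0                      # s-sparse coefficient vector
      y = X @ theta0 + sigma * rng.standard_normal(n)

      fit = Lasso(alpha=0.1).fit(X, y)
      resid = y - fit.predict(X)
      df = np.sum(fit.coef_ != 0)                     # LASSO degrees of freedom
      sigma_hat = np.sqrt(resid @ resid / (n - df))   # naive noise-level estimate
      risk = np.linalg.norm(fit.coef_ - theta0)       # oracle l2 risk (known here)
      print(f"sigma_hat = {sigma_hat:.3f}, true l2 risk = {risk:.3f}")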
  8. By: Abdelkamel Alj; Christophe Ley; Guy Melard
    Keywords: non-stationary process; multivariate time series; time-varying models
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/200183&r=ecm
  9. By: Fuchun Li
    Abstract: The author proposes a test for the parametric specification of each component in the diffusion matrix of a d-dimensional diffusion process. Overall, d(d-1)/2 test statistics are constructed for the off-diagonal components, while d test statistics are constructed for the main diagonal components. Using theories of degenerate U-statistics, each of these test statistics is shown to follow an asymptotic standard normal distribution under the null hypothesis, while diverging to infinity if the component is misspecified over a significant range. Our tests strongly reject the specification of diffusion functions in a variety of popular univariate interest rate models for daily 7-day eurodollar spot rates, and the specification of the diffusion matrix in some popular multivariate affine term-structure models for monthly U.S. Treasury yields.
    Keywords: Asset Pricing, Econometric and statistical methods, Interest rates
    JEL: C12 C14 E17 E43 G12 G20
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:15-17&r=ecm
  10. By: Violetta Dalla; Javier Hidalgo
    Keywords: Nonparametric regression, Breaks/smoothness, Strong dependence, Extreme-values distribution, Frequency domain bootstrap algorithms.
    JEL: C14 C22
    Date: 2015–03
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2015/584&r=ecm
  11. By: Bertille Antoine (Simon Fraser University); Otilia Boldea (Tilburg University)
    Abstract: Decades of empirical evidence suggest that many macroeconometric and financial models are subject to both instability and identification problems. In this paper, we address both issues under the unified framework of time-varying information, which includes changes in instrument strength, changes in the second moment of instruments, and changes in the variance of moment conditions. We develop a comprehensive econometric method that detects and exploits these changes to increase the efficiency of the estimates of the (stable) structural parameters. We estimate a New Keynesian Phillips Curve and obtain more precise estimates of the price indexation parameters than standard methods. An extensive simulation study also shows that our method delivers substantial efficiency gains in finite samples.
    Keywords: GMM, Weak instruments, Break point, Change in identification strength
    JEL: C13 C22 C26 C36 C51
    Date: 2015–06–04
    URL: http://d.repec.org/n?u=RePEc:sfu:sfudps:dp15-04&r=ecm
  12. By: Esposito, Francesco Paolo; Cummins, Mark
    Abstract: Extending the previous risk model backtesting literature, we construct multiple hypothesis testing (MHT) with the stationary bootstrap. We conduct multiple tests which control for the generalized confidence level and employ the bootstrap MHT to design multiple comparison testing. We consider absolute and relative predictive ability to test a range of competing risk models, focusing on Value-at-Risk (VaR) and Expected Shortfall (ExS). In devising the test for absolute predictive ability, we take the route of recent literature and construct balanced simultaneous confidence sets that control for the generalized family-wise error rate, the joint probability of rejecting true hypotheses. We implement a step-down method which increases the power of the MHT in isolating false discoveries. In testing for ExS model predictive ability, we design a new simple test to draw inference about recursive model forecasting capability. In the second suite of statistical tests, we develop a novel device for measuring relative predictive ability in the bootstrap MHT framework. This device, which we coin the multiple comparison mapping, provides a statistically robust instrument designed to answer the question: ''which model is the best model?''.
    Keywords: value-at-risk, expected shortfall, bootstrap multiple hypothesis testing, generalized familywise error rate, multiple comparison map
    JEL: C12
    Date: 2015–03–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:64986&r=ecm
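    The resampling engine referenced above is the stationary bootstrap of Politis and Romano; a minimal sketch of one resample (the multiple-testing layer built on top of it is not shown, and the mean block length is an assumption):

      import numpy as np

      def stationary_bootstrap_indices(n, avg_block, rng):
          # Blocks of geometric length with mean avg_block, wrapping
          # circularly; each step continues the current block with
          # probability 1 - 1/avg_block, otherwise restarts at random.
          p = 1.0 / avg_block
          idx = np.empty(n, dtype=int)
          t = rng.integers(n)
          for i in range(n):
              idx[i] = t
              t = rng.integers(n) if rng.random() < p else (t + 1) % n
          return idx

      returns = np.random.default_rng(7).standard_normal(1000)
      resample = returns[stationary_bootstrap_indices(1000, 10, np.random.default_rng(8))]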
  13. By: Joshua C.C. Chan; Eric Eisenstat
    Abstract: Empirical work in macroeconometrics has been mostly restricted to using VARs, even though there are strong theoretical reasons to consider general VARMAs. A number of articles in the last two decades have conjectured that this is because estimation of VARMAs is perceived to be challenging and proposed various ways to simplify it. Nevertheless, VARMAs continue to be largely dominated by VARs, particularly in terms of developing useful extensions. We address these computational challenges with a Bayesian approach. Specifically, we develop a Gibbs sampler for the basic VARMA, and demonstrate how it can be extended to models with time-varying VMA coefficients and stochastic volatility. We illustrate the methodology through a macroeconomic forecasting exercise. We show that in a class of models with stochastic volatility, VARMAs produce better density forecasts than VARs, particularly for short forecast horizons.
    Keywords: state space, stochastic volatility, factor model, macroeconomic forecasting, density forecast
    JEL: C11 C32 C53
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2015-19&r=ecm
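    For orientation, a classical (non-Bayesian) VARMA(1,1) fit via statsmodels on simulated data; the paper's Gibbs sampler, time-varying VMA coefficients and stochastic volatility are well beyond this sketch:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      n = 300
      e = rng.standard_normal((n, 2))
      y = np.zeros((n, 2))
      for t in range(1, n):                 # VARMA(1,1) with scalar coefficients
          y[t] = 0.5 * y[t - 1] + e[t] + 0.3 * e[t - 1]
      res = sm.tsa.VARMAX(y, order=(1, 1), trend="n").fit(disp=False)
      print(res.summary())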
  14. By: Farbmacher, Helmut; Kögel, Heinrich (Munich Center for the Economics of Aging (MEA))
    Abstract: In the presence of heteroskedasticity, conventional standard errors (which assume homoskedasticity) can be biased up or down. The most common form of heteroskedasticity leads to conventional standard errors that are too small. When Wald tests based on these standard errors are insignificant, heteroskedasticity robust standard errors do not change inference. On the other hand, inference is conservative in a setting with upward-biased conventional standard errors. We discuss the power gains when using robust standard errors in this case and also potential problems of heteroskedasticity tests. As a solution for the poor performance of the usual heteroskedasticity tests in this setting, we propose a modification of the White test which has better properties. We illustrate our findings using a study in labor economics. The correct standard errors turn out to be around 15 percent lower, leading to different policy conclusions. Moreover, only our modified test is able to detect heteroskedasticity in this application.
    Date: 2015–03–10
    URL: http://d.repec.org/n?u=RePEc:mea:meawpa:201503&r=ecm
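    A sketch of the baseline the paper improves on: the standard White test, and conventional versus robust standard errors, in statsmodels (the heteroskedastic DGP is an assumption; the paper's modified White test is not available in a standard library):

      import numpy as np
      import statsmodels.api as sm
      from statsmodels.stats.diagnostic import het_white

      rng = np.random.default_rng(2)
      n = 500
      x = rng.standard_normal(n)
      u = rng.standard_normal(n) * np.sqrt(0.5 + x ** 2)  # variance rises with x^2
      y = 1.0 + 2.0 * x + u
      X = sm.add_constant(x)
      res = sm.OLS(y, X).fit()
      lm_stat, lm_pval, f_stat, f_pval = het_white(res.resid, X)
      print(f"White LM p-value: {lm_pval:.4f}")
      print(f"conventional SE: {res.bse[1]:.4f}, robust HC1 SE: {res.HC1_se[1]:.4f}")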
  15. By: Daniel Felix Ahelegbey (Department of Economics, University of Venice Cà Foscari)
    Abstract: Recent advances in empirical finance have seen a growing interest in the application of network models to analyse contagion, spillover effects and risk propagation channels in the financial system. While interconnectivity among financial institutions has been widely studied, only a few papers review networks in finance, and they do not focus on the econometric aspects. This paper surveys the state of the art in statistical inference and applications of networks from a multidisciplinary perspective, specifically in the context of systemic risk. We contribute to the literature on network econometrics by relating network models to multivariate analysis, with potential applications in econometrics and finance.
    Keywords: Bayesian inference, Graphical models, Model selection, Systemic risk.
    JEL: C11 C15 C52 G01 G17
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:ven:wpaper:2015:13&r=ecm
  16. By: Laurent Callot (University of Amsterdam and CREATES); Johannes Tang Kristensen (University of Southern Denmark and CREATES)
    Abstract: This paper shows that the parsimoniously time-varying methodology of Callot and Kristensen (2015) can be applied to factor models. We apply this method to study macroeconomic instability in the US from 1959:1 to 2006:4 with a particular focus on the Great Moderation. Models with parsimoniously time-varying parameters are models with an unknown number of break points at unknown locations. The parameters are assumed to follow a random walk with a positive probability that an increment is exactly equal to zero so that the parameters do not vary at every point in time. The vector of increments, which is high dimensional by construction and sparse by assumption, is estimated using the Lasso. We apply this method to the estimation of static factor models and factor augmented autoregressions using a set of 190 quarterly observations of 144 US macroeconomic series from Stock and Watson (2009). We find that the parameters of both models exhibit a higher degree of instability in the period from 1970:1 to 1984:4 relative to the following 15 years. In our setting the Great Moderation appears as the gradual ending of a period of high structural instability that took place in the 1970s and early 1980s.
    Keywords: Parsimoniously time-varying parameters, factor models, structural break, Lasso
    JEL: C01 C13 C32 C38 E32
    Date: 2015–06–01
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-29&r=ecm
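    A toy version of the parsimoniously time-varying idea in a single-regressor model, as a hedged sketch: the coefficient path is a random walk whose increments are mostly exactly zero, and the sparse increment vector is recovered by the Lasso (break dates, penalty and detection threshold are assumptions; the paper applies this machinery to factor models):

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(3)
      T = 200
      x = rng.standard_normal(T)
      delta = np.zeros(T)
      delta[0], delta[100] = 1.0, -0.5     # two increments: beta_t shifts at t = 0, 100
      beta = np.cumsum(delta)              # parameter path: random walk, sparse steps
      y = beta * x + 0.1 * rng.standard_normal(T)

      # Column s of Z is x_t * 1{t >= s}, so coefficient s is the increment at s.
      Z = x[:, None] * (np.arange(T)[:, None] >= np.arange(T)[None, :])
      gamma = Lasso(alpha=0.01, fit_intercept=False).fit(Z, y).coef_
      print("estimated break dates:", np.flatnonzero(np.abs(gamma) > 0.05))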
  17. By: Nathaniel T. Wilcox (Economic Science Institute (Chapman University) and Center for the Economic Analysis of Risk (Georgia State University))
    Abstract: I compare the generalization ability, or out-of-sample predictive success, of four probabilistic models of binary discrete choice under risk. One model is the conventional homoscedastic latent index model—the simple logit—that is common in applied econometrics: This model is “context-free” in the sense that its error part is homoscedastic with respect to decision sets. The other three models are also latent index models but their error part is heteroscedastic with respect to decision sets: In that sense they are “context-dependent” models. Context-dependent models of choice under risk arise from several different theoretical perspectives. Here I consider my own “contextual utility” model (Wilcox 2011), the “decision field theory” model of Busemeyer and Townsend (1993) and the “Blavatskyy-Fishburn” model (Fishburn 1978; Blavatskyy 2014). In a new experiment, all three context-dependent models outperform the context-free model in prediction, and significantly outperform a linear probability model (suggested by contemporary applied practice a la Angrist and Pischke 2009) when the latent preference structure is rank-dependent utility (Quiggin 1982). All of this holds true for function-free estimations of outcome utilities and probability weights as well as parametric estimations. Preoccupation with theories of the deterministic structure of choice under risk, to the exclusion of theories of error, is a mistake.
    Keywords: risk, discrete choice, probabilistic choice, heteroscedasticity, prediction
    JEL: C25 C91 D81
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:chu:wpaper:15-11&r=ecm
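    A schematic contrast between the context-free and context-dependent latent index models, loosely in the spirit of contextual utility; the exact normalization in Wilcox (2011) differs, and all numbers here are assumptions:

      import numpy as np

      def choice_prob(eu_a, eu_b, scale):
          # Logit probability of choosing lottery A over B.
          return 1.0 / (1.0 + np.exp(-(eu_a - eu_b) / scale))

      # Context-free: one noise scale for every decision set.
      p_free = choice_prob(0.6, 0.5, scale=0.2)
      # Context-dependent: the scale varies with the decision set, here
      # with the range of utilities available in that set.
      u_max, u_min = 0.9, 0.4
      p_context = choice_prob(0.6, 0.5, scale=0.2 * (u_max - u_min))
      print(p_free, p_context)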
  18. By: A. Sayers; J. Heron; A. Smith; C. Macdonald-Wallis; M. Gilthorpe; F. Steele; K. Tilling
    Abstract: There is a growing debate with regard to the appropriate methods of analysis of growth trajectories and their association with prospective dependent outcomes. Using the example of childhood growth and adult blood pressure (BP), we conducted an extensive simulation study to explore four two-stage and two joint modelling methods, and compared their bias and coverage in estimation of the (unconditional) association between birth length and later BP, and the association between growth rate and later BP (conditional on birth length). We show that the two-stage method of using multilevel models to estimate growth parameters and relating these to the outcome gives unbiased estimates of the conditional associations between growth and outcome. Using simulations, we demonstrate that the simple methods resulted in bias in the presence of measurement error, as did the two-stage multilevel method when looking at the total (unconditional) association of birth length with the outcome. The two joint modelling methods gave unbiased results, but using the re-inflated residuals led to undercoverage of the confidence intervals. We conclude that either joint modelling or the simpler two-stage multilevel approach can be used to estimate conditional associations between growth and later outcomes, but that only joint modelling is unbiased with nominal coverage for unconditional associations.
    Keywords: lifecourse epidemiology; joint model; multilevel model; measurement error; growth
    JEL: C1
    Date: 2014–09
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:62246&r=ecm
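    A sketch of the two-stage multilevel route on synthetic data with statsmodels MixedLM: stage one fits a random intercept and slope growth model, stage two regresses the later outcome on the per-child predicted effects. The variable names and DGP are illustrative, and the shrinkage of the predicted effects is one source of the biases the paper documents:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n_child, n_obs = 200, 6
      b0 = rng.normal(50, 2, n_child)      # child-specific birth length
      b1 = rng.normal(1.0, 0.2, n_child)   # child-specific growth rate
      rows = [(i, age, b0[i] + b1[i] * age + rng.normal(0, 1))
              for i in range(n_child) for age in range(n_obs)]
      df = pd.DataFrame(rows, columns=["child", "age", "length"])

      # Stage 1: multilevel growth model with random intercept and slope.
      m1 = sm.MixedLM.from_formula("length ~ age", groups="child",
                                   re_formula="~age", data=df).fit()
      re = pd.DataFrame(m1.random_effects).T.values  # predicted child effects

      # Stage 2: regress the prospective outcome on the predicted effects.
      bp = 100 + 0.5 * b0 + 3.0 * b1 + rng.normal(0, 2, n_child)
      m2 = sm.OLS(bp, sm.add_constant(re)).fit()
      print(m2.params)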
  19. By: Goh, Joel (Stanford University); Bayati, Mohsen (Stanford University); Zenios, Stefanos A. (Stanford University); Singh, Sundeep (Stanford University); Moore, David (Stanford University)
    Abstract: Cost-effectiveness studies of medical innovations often suffer from data inadequacy. When Markov chains are used as a modeling framework for such studies, this data inadequacy can manifest itself as imprecise estimates for many elements of the transition matrix. In this paper, we study how to compute maximal and minimal values for the discounted value of the chain (with respect to a vector of state-wise costs or rewards) as these uncertain transition parameters jointly vary within a given uncertainty set. We show that these problems are computationally tractable if the uncertainty set has a row-wise structure. Conversely, we prove that if the row-wise structure is relaxed slightly, the problems become computationally intractable (NP-hard). We apply our model to assess the cost-effectiveness of fecal immunochemical testing (FIT), a new screening method for colorectal cancer. Our results show that despite the large uncertainty in FIT's performance, it is highly cost-effective relative to the prevailing screening method of colonoscopy.
    Date: 2015
    URL: http://d.repec.org/n?u=RePEc:ecl:stabus:3283&r=ecm
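    The object being bounded is the discounted value of the chain, v = (I - γP)^{-1} c. A minimal numpy sketch evaluating it at a nominal transition matrix; the toy chain, costs and discount factor are assumptions, and the paper's actual contribution, optimizing v over a row-wise uncertainty set for P, is not shown:

      import numpy as np

      gamma = 0.97
      P = np.array([[0.9, 0.1, 0.0],       # toy 3-state progression chain
                    [0.0, 0.8, 0.2],
                    [0.0, 0.0, 1.0]])
      c = np.array([1.0, 5.0, 20.0])       # per-period cost in each state
      v = np.linalg.solve(np.eye(3) - gamma * P, c)
      print("discounted cost-to-go by starting state:", v)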
  20. By: Chevillon, Guillaume (ESSEC Business School); Hecq, Alain (Maastricht University (Department of Quantitative Economics)); Laurent, Sébastien (Aix-Marseille University (Aix-Marseille School of Economics))
    Abstract: This paper shows that large dimensional vector autoregressive (VAR) models of finite order can generate long memory in the marginalized univariate series. We derive high-level assumptions under which the final equation representation of a VAR(1) leads to univariate fractional white noises and verify the validity of these assumptions for two specific models. We consider the implications of our findings for the variances of asset returns, where the so-called golden rule of realized variances states that they tend always to exhibit fractional integration of a degree close to 0.4.
    Keywords: Long memory; Vector Autoregressive Model; Marginalization; Final Equation Representation; Volatility
    JEL: C10 C32 C58
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:ebg:essewp:dr-15007&r=ecm
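    To check the implied long memory in a marginalized series, one can estimate the fractional integration order d with the classic GPH log-periodogram regression; a minimal sketch, with the bandwidth m left as the usual judgment call:

      import numpy as np

      def gph(x, m):
          # Regress log-periodogram ordinates on -2*log(2*sin(lambda/2));
          # the slope is the memory parameter d.
          n = len(x)
          lam = 2 * np.pi * np.arange(1, m + 1) / n
          I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
          reg = -2 * np.log(2 * np.sin(lam / 2))
          return np.polyfit(reg, np.log(I), 1)[0]

      # e.g. d_hat = gph(series, m=int(len(series) ** 0.5)), applied to a
      # component marginalized from a simulated large VAR(1).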
  21. By: Christopher G. Gibbs (School of Economics, UNSW Business School, UNSW)
    Abstract: This paper proposes a new dynamic forecast combination strategy for forecasting inflation. The procedure draws on explanations of why the forecast combination puzzle exists and the stylized fact that Phillips curve forecasts of inflation exhibit significant time-variation in forecast accuracy. The forecast combination puzzle is the empirical observation that a simple average of point forecasts is often the best forecasting strategy. The forecast combination puzzle exists because many dynamic weighting strategies tend to shift weights toward Phillips curve forecasts after they exhibit a significant period of relative forecast improvement, which is often when their forecast accuracy begins to deteriorate. The strategy proposed in this paper weights forecasts according to their expected performance rather than their past performance, in order to anticipate these changes in forecast accuracy. The forward-looking approach is shown to robustly beat equal-weight combined and benchmark univariate forecasts of inflation in real-time out-of-sample exercises on U.S. and New Zealand inflation data.
    Keywords: Forecast combination, inflation, forecast pooling, forecast combination puzzle, Phillips curve
    JEL: E17 E47 C53
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:swe:wpaper:2015-09&r=ecm
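    For contrast with the paper's forward-looking strategy, a sketch of the conventional backward-looking weighting it argues against: combination weights proportional to inverse mean squared error over a recent window (the window length is an assumption):

      import numpy as np

      def inverse_mse_weights(errors, window=20):
          # errors: (T, k) array of past forecast errors for k models.
          mse = np.mean(errors[-window:] ** 2, axis=0)
          w = 1.0 / mse
          return w / w.sum()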
  22. By: Christopher F Baum (Boston College; DIW Berlin); Hans Lööf (Royal Institute of Technology, Stockholm); Pardis Nabavi (Royal Institute of Technology, Stockholm); Andreas Stephan (Jönköping International Business School)
    Abstract: We evaluate a Generalized Structural Equation Model (GSEM) approach to the estimation of the relationship between R&D, innovation and productivity that focuses on the potentially crucial heterogeneity across technology and knowledge levels. The model accounts for selectivity and handles the endogeneity of this relationship in a recursive framework. Employing a panel of Swedish firms observed in three consecutive Community Innovation Surveys, our maximum likelihood estimates show that many key channels of influence among the model's components differ meaningfully in their statistical significance and magnitude across sectors defined by different technology levels.
    Keywords: R&D, Innovation, Productivity, Generalized Structural Equation Model, Community Innovation Survey
    JEL: C23 L6 O32 O52
    Date: 2015–05–29
    URL: http://d.repec.org/n?u=RePEc:boc:bocoec:876&r=ecm

This nep-ecm issue is ©2015 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.