nep-ecm New Economics Papers
on Econometrics
Issue of 2021‒10‒25
twenty-one papers chosen by
Sune Karlsson
Örebro universitet

  1. Modelling Time-Varying Volatility Interactions By Susana Campos-Martins; Cristina Amado
  2. Testing for long-range dependence in non-stationary time series time-varying regression By Lujia Bai; Weichi Wu
  3. Parameter Stability Testing for Multivariate Dynamic Time-Varying Models By Jiti Gao; Bin Peng; Yayi Yan
  4. Bi-integrative analysis of two-dimensional heterogeneous panel data model By Wei Wang; Xiaodong Yan; Yanyan Ren; Zhijie Xiao
  5. Robust Dynamic Panel Data Models Using ε-Contamination By Badi H. Baltagi; Georges Bresson; Anoop Chaturvedi; Guy Lacroix
  6. Revisiting identification concepts in Bayesian analysis By Jean-Pierre Florens; Anna Simoni
  7. Exact Bias Correction for Linear Adjustment of Randomized Controlled Trials By Haoge Chang; Joel Middleton; Peter Aronow
  8. Robust Generalized Method of Moments: A Finite Sample Viewpoint By Dhruv Rohatgi; Vasilis Syrgkanis
  9. Two-stage least squares with a randomly right censored outcome By Jad Beyhum
  10. One Instrument to Rule Them All: The Bias and Coverage of Just-ID IV By Joshua Angrist; Michal Kolesár
  11. Conditional Heteroscedasticity Models with Time-Varying Parameters: Estimation and Asymptotics By Armin Pourkhanali; Jonathan Keith; Xibin Zhang
  12. High-dimensional Inference for Dynamic Treatment Effects By Jelena Bradic; Weijie Ji; Yuqian Zhang
  13. Difference-in-Differences with Geocoded Microdata By Kyle Butts
  14. On the asymptotic behavior of bubble date estimators By Eiji Kurozumi; Anton Skrobotov
  15. Time-varying Granger causality tests for applications in global crude oil markets: A study on the DCC-MGARCH Hong test By Caporin, Massimiliano; Costola, Michele
  16. Attention Overload By Matias D. Cattaneo; Paul Cheung; Xinwei Ma; Yusufcan Masatlioglu
  17. Robust Inference for the Frisch Labor Supply Elasticity By Michael Keane; Timothy Neal
  18. A quantile based dimension reduction technique By Méndez Civieta, Álvaro; Aguilera Morillo, María del Carmen; Lillo Rodríguez, Rosa Elvira
  19. Adaptive Learning on Time Series: Method and Financial Applications By Parley Ruogu Yang; Ryan Lucas; Camilla Schelpe
  20. Robustifying Markowitz By Härdle, Wolfgang; Klochkov, Yegor; Petukhina, Alla; Zhivotovskiy, Nikita
  21. Faster fiscal stimulus and a higher government spending multiplier in China: Mixed-frequency identification with SVAR By Mingyang Li; Linlin Niu

  1. By: Susana Campos-Martins (University of Oxford, University of Minho and NIPE); Cristina Amado (University of Minho and NIPE, CREATES and Aarhus University)
    Abstract: In this paper, we propose an additive time-varying (or partially time-varying) multivariate model of volatility, where a time-dependent component is added to the extended vector GARCH process for modelling the dynamics of volatility interactions. In our framework, co-dependence in volatility is allowed to change smoothly between two extreme states, and second-moment interdependence is identified from these crisis-contingent structural changes. The estimation of the new time-varying vector GARCH process is simplified by using an equation-by-equation estimator for the volatility equations in the first step and estimating the correlation matrix in the second step. A new Lagrange multiplier test is derived for testing the null hypothesis of constant volatility co-dependence against a smoothly time-varying interdependence between financial markets. The test appears to be a useful statistical tool for evaluating the adequacy of GARCH equations by testing for the presence of significant changes in cross-market volatility transmissions. Monte Carlo simulation experiments show that the test statistic has satisfactory empirical properties in finite samples. An application to sovereign bond yield returns illustrates the modelling strategy of the new specification.
    Keywords: Multivariate time-varying GARCH; Volatility spillovers; Time-variation; Lagrange multiplier test; Financial market interdependence.
    JEL: C12 C13 C32 C51 G15
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:nip:nipewp:12/2021&r=
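    A minimal Python sketch (an editor's illustration, not the authors' code) of the core idea that co-dependence moves smoothly between two extreme states: a logistic transition in rescaled time shifts a volatility-interaction coefficient from a calm-state to a crisis-state level. The parameter names (gamma, c, lam0, lam1) are hypothetical.

      import numpy as np

      def logistic_transition(t_over_T, gamma=25.0, c=0.5):
          """Smooth transition G in [0, 1] as a function of rescaled time t/T;
          gamma controls the speed of the transition, c its location."""
          return 1.0 / (1.0 + np.exp(-gamma * (t_over_T - c)))

      T = 500
      time = np.arange(1, T + 1) / T
      lam0, lam1 = 0.05, 0.30   # co-dependence in the two extreme states
      lam_t = lam0 + lam1 * logistic_transition(time)  # smoothly time-varying interaction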
  2. By: Lujia Bai; Weichi Wu
    Abstract: We consider the problem of testing for long-range dependence in time-varying coefficient regression models. The covariates and errors are assumed to be locally stationary, which allows complex temporal dynamics and heteroscedasticity. We develop KPSS, R/S, V/S, and K/S-type statistics based on the nonparametric residuals, and propose bootstrap approaches equipped with a difference-based long-run covariance matrix estimator for practical implementation. Under the null hypothesis, under local alternatives, and under fixed alternatives, we derive the limiting distributions of the test statistics, establish the uniform consistency of the difference-based long-run covariance estimator, and justify the bootstrap algorithms theoretically. In particular, the exact local asymptotic power of our testing procedure enjoys the order $O(\log^{-1} n)$, the same as that of the classical KPSS test for long memory in strictly stationary series without covariates. We demonstrate the effectiveness of our tests through extensive simulation studies. The proposed tests are applied to a COVID-19 dataset, finding evidence in favor of long-range dependence in the cumulative confirmed-case series of several countries, and to the Hong Kong circulatory and respiratory dataset, identifying a new type of 'spurious long memory'.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.08089&r=
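    For readers unfamiliar with the KPSS-type statistics the abstract builds on, here is a minimal sketch of the classical KPSS statistic computed from a residual series; the paper replaces the residuals with nonparametric ones and the long-run variance with a difference-based estimator, for which the Bartlett-kernel estimator below is only a stand-in.

      import numpy as np

      def kpss_statistic(resid, bandwidth=None):
          """KPSS statistic: normalized partial sums of demeaned residuals,
          scaled by a Bartlett-kernel long-run variance estimate."""
          n = resid.shape[0]
          e = resid - resid.mean()
          s = np.cumsum(e)                     # partial-sum process
          if bandwidth is None:
              bandwidth = int(np.floor(4 * (n / 100.0) ** 0.25))
          lrv = np.sum(e * e) / n
          for k in range(1, bandwidth + 1):
              w = 1.0 - k / (bandwidth + 1.0)  # Bartlett weights
              lrv += 2.0 * w * np.sum(e[k:] * e[:-k]) / n
          return np.sum(s ** 2) / (n ** 2 * lrv)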
  3. By: Jiti Gao; Bin Peng; Yayi Yan
    Abstract: Multivariate dynamic models are widely used in practical studies, providing a tractable way to capture evolving interrelationships among multivariate time series, yet few studies focus on inference. Along this line, a key question is whether some (if not all) of the coefficients evolve with time. To settle this issue, the paper develops a Wald-type test statistic for detecting time-invariant parameters in a class of multivariate dynamic time-varying models. Since Gaussian/stationary approximation methods initially proposed for univariate time series settings are inapplicable to the setting considered in this paper, we develop an approximation method using a time-varying vector moving average infinity process. We show that the test statistic is asymptotically normal under both the null hypothesis and local alternatives. Simulation studies show that the proposed test has desirable finite sample performance.
    Keywords: multivariate time series, parameter instability, specification testing, time-varying coefficient
    JEL: C12 C14 C32
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2021-11&r=
  4. By: Wei Wang; Xiaodong Yan; Yanyan Ren; Zhijie Xiao
    Abstract: Heterogeneous panel data models that allow the coefficients to vary across individuals and/or change over time have received increasing attention in statistics and econometrics. This paper proposes a two-dimensional heterogeneous panel regression model that incorporates a group structure of individual heterogeneous effects with cohort formation for their time variation, allowing common coefficients between nonadjacent time points. A bi-integrative procedure is introduced that detects the group and cohort patterns simultaneously via doubly penalized least squares with concave fused penalties. We use an alternating direction method of multipliers (ADMM) algorithm that automatically bi-integrates the two-dimensional heterogeneous panel data model into a common one. Consistency and asymptotic normality of the proposed estimators are developed. We show that the resulting estimators exhibit oracle properties, i.e., the proposed estimator is asymptotically equivalent to the oracle estimator obtained using the known group and cohort structures. Furthermore, simulation studies provide supportive evidence that the proposed method has good finite sample performance. A real-data empirical application highlights the proposed method.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.10480&r=
  5. By: Badi H. Baltagi (Center for Policy Research, Maxwell School, Syracuse University, 426 Eggers Hall, Syracuse, NY 13244); Georges Bresson (Department of Economics, Université Paris II, France); Anoop Chaturvedi (Department of Statistics, University of Allahabad, India); Guy Lacroix (Département d'économique, Université Laval, Québec, Canada)
    Abstract: This paper extends the work of Baltagi et al. (2018) to the popular dynamic panel data model. We investigate the robustness of Bayesian panel data models to possible misspecification of the prior distribution. The proposed robust Bayesian approach departs from the standard Bayesian framework in two ways. First, we consider the ε-contamination class of prior distributions for the model parameters as well as for the individual effects. Second, both the base elicited priors and the ε-contamination priors use Zellner (1986)'s g-priors for the variance-covariance matrices. We propose a general "toolbox" for a wide range of specifications which includes the dynamic panel model with random effects, with cross-correlated effects à la Chamberlain, for the Hausman-Taylor world and for dynamic panel data models with homogeneous/heterogeneous slopes and cross-sectional dependence. Using a Monte Carlo simulation study, we compare the finite sample properties of our proposed estimator to those of standard classical estimators. The paper contributes to the dynamic panel data literature by proposing a general robust Bayesian framework which encompasses the conventional frequentist specifications and their associated estimation methods as special cases.
    Keywords: Dynamic Model, ε-Contamination, g-Priors, Type-II Maximum Likelihood Posterior Density, Panel Data, Robust Bayesian Estimator, Two-Stage Hierarchy
    JEL: C11 C23 C26
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:240&r=
  6. By: Jean-Pierre Florens; Anna Simoni
    Abstract: This paper studies the role played by identification in the Bayesian analysis of statistical and econometric models. First, for unidentified models we demonstrate that there are situations where the introduction of a non-degenerate prior distribution can make a parameter that is nonidentified in frequentist theory identified in Bayesian theory. In other situations, it is preferable to work with the unidentified model and construct a Markov chain Monte Carlo (MCMC) algorithm for it instead of introducing identifying assumptions. Second, for partially identified models we demonstrate how to construct the prior and posterior distributions for the identified set parameter and how to conduct Bayesian analysis. Finally, for models that contain some parameters that are identified and others that are not, we show that marginalizing the identified parameter out of the likelihood with respect to its conditional prior, given the nonidentified parameter, allows the data to be informative about the nonidentified and partially identified parameters. The paper provides examples and simulations that illustrate how to implement our techniques.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.09954&r=
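    A tiny conjugate example of the first point (an editor's sketch under simplified assumptions, not from the paper): theta1 and theta2 enter the likelihood only through their sum, so neither is identified in the frequentist sense, yet proper N(0, 1) priors yield a proper posterior for theta1 that the data move through the identified sum.

      import numpy as np

      rng = np.random.default_rng(0)
      n, sigma2 = 50, 1.0
      theta = np.array([1.0, -0.3])        # true values; only the sum is identified
      y = rng.normal(theta.sum(), np.sqrt(sigma2), size=n)

      a = np.array([1.0, 1.0])             # y_i ~ N(a' theta, sigma2), prior theta ~ N(0, I)
      prec_post = np.eye(2) + (n / sigma2) * np.outer(a, a)   # posterior precision
      cov_post = np.linalg.inv(prec_post)
      mean_post = cov_post @ (a * y.sum() / sigma2)
      print("posterior mean and sd of theta1:", mean_post[0], np.sqrt(cov_post[0, 0]))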
  7. By: Haoge Chang; Joel Middleton; Peter Aronow
    Abstract: In an influential critique of empirical practice, Freedman (2008a, 2008b) showed that the linear regression estimator was biased for the analysis of randomized controlled trials under the randomization model. Under Freedman's assumptions, we derive exact closed-form bias corrections for the linear regression estimator with and without treatment-by-covariate interactions. We show that the limiting distribution of the bias-corrected estimator is identical to that of the uncorrected estimator, implying that the asymptotic gains from adjustment can be attained without introducing any risk of bias. Taken together with results from Lin (2013), our results show that Freedman's theoretical arguments against the use of regression adjustment can be completely resolved with minor modifications to practice.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.08425&r=
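    A toy randomization-model simulation in the spirit of Freedman's critique (illustrative only; the paper's exact corrections are not reproduced here): holding a finite population of potential outcomes fixed, re-randomize treatment many times and measure the small-sample bias of the covariate-adjusted OLS estimator of the average treatment effect.

      import numpy as np

      rng = np.random.default_rng(0)
      n, n_treat = 20, 10
      x = rng.normal(size=n)
      y0 = x + 0.5 * x ** 2 + rng.normal(scale=0.3, size=n)  # fixed potential outcomes
      y1 = y0 + 1.0 + x                                      # heterogeneous treatment effect
      true_ate = np.mean(y1 - y0)

      est = []
      for _ in range(10000):
          d = np.zeros(n)
          d[rng.choice(n, n_treat, replace=False)] = 1.0     # complete randomization
          y = d * y1 + (1 - d) * y0
          X = np.column_stack([np.ones(n), d, x - x.mean()]) # adjusted regression, no interaction
          beta = np.linalg.lstsq(X, y, rcond=None)[0]
          est.append(beta[1])
      print("randomization bias of the adjusted estimator:", np.mean(est) - true_ate)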
  8. By: Dhruv Rohatgi; Vasilis Syrgkanis
    Abstract: For many inference problems in statistics and econometrics, the unknown parameter is identified by a set of moment conditions. A generic method of solving moment conditions is the Generalized Method of Moments (GMM). However, classical GMM estimation is potentially very sensitive to outliers. Robustified GMM estimators have been developed in the past, but suffer from several drawbacks: computational intractability, poor dimension-dependence, and no quantitative recovery guarantees in the presence of a constant fraction of outliers. In this work, we develop the first computationally efficient GMM estimator (under intuitive assumptions) that can tolerate a constant $\epsilon$ fraction of adversarially corrupted samples, and that has an $\ell_2$ recovery guarantee of $O(\sqrt{\epsilon})$. To achieve this, we draw upon and extend a recent line of work on algorithmic robust statistics for related but simpler problems such as mean estimation, linear regression and stochastic optimization. As two examples of the generality of our algorithm, we show how our estimation algorithm and assumptions apply to instrumental variables linear and logistic regression. Moreover, we experimentally validate that our estimator outperforms classical IV regression and two-stage Huber regression on synthetic and semi-synthetic datasets with corruption.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.03070&r=
  9. By: Jad Beyhum
    Abstract: This note develops a simple two-stage least squares (2SLS) procedure to estimate the causal effect of endogenous regressors on a randomly right-censored outcome in the linear model. The proposal replaces the usual ordinary least squares regressions of standard 2SLS with weighted least squares regressions, where the weights correspond to the inverse probability of censoring. We show consistency and asymptotic normality of the estimator. The estimator exhibits good finite-sample performance in simulations.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.05107&r=
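    A minimal sketch of the procedure as described (simplified: ties and the exact evaluation point of the Kaplan-Meier curve are glossed over, and all variable names are placeholders): each 2SLS stage becomes weighted least squares, with weights equal to the inverse of the estimated probability of being uncensored.

      import numpy as np

      def km_censoring_survival(t_obs, delta):
          """Kaplan-Meier survival curve of the censoring time, evaluated at each
          observed time; delta = 1 flags uncensored outcomes."""
          order = np.argsort(t_obs)
          d_c = 1 - delta[order]                    # censoring events, time-ordered
          n = len(t_obs)
          at_risk = n - np.arange(n)
          surv = np.cumprod(1.0 - d_c / at_risk)
          out = np.empty(n)
          out[order] = surv
          return out

      def wls(Z, target, w):
          """Weighted least squares coefficients of `target` on `Z`."""
          Zw = Z * w[:, None]
          return np.linalg.solve(Zw.T @ Z, Zw.T @ target)

      def ipcw_2sls(y_obs, delta, x_endog, z_instr):
          G = km_censoring_survival(y_obs, delta)
          w = delta / np.clip(G, 1e-10, None)       # inverse-probability-of-censoring weights
          Z = np.column_stack([np.ones_like(y_obs), z_instr])
          X = np.column_stack([np.ones_like(y_obs), x_endog])
          x_hat = Z @ wls(Z, X, w)                  # weighted first stage
          return wls(x_hat, y_obs, w)               # weighted second stage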
  10. By: Joshua Angrist; Michal Kolesár
    Abstract: Two-stage least squares estimates in heavily over-identified instrumental variables (IV) models can be misleadingly close to the corresponding ordinary least squares (OLS) estimates when many instruments are weak. Just-identified (just-ID) IV estimates using a single instrument are also biased, but the importance of weak-instrument bias in just-ID IV applications remains contentious. We argue that in microeconometric applications, just-ID IV estimators can typically be treated as all but unbiased and that the usual inference strategies are likely to be adequate. The argument begins with contour plots of confidence interval coverage as a function of instrument strength and explanatory variable endogeneity. These show undercoverage in excess of 5% only for endogeneity beyond that seen even when IV and OLS estimates differ by an order of magnitude. Three widely cited microeconometric applications are used to explain why endogeneity is likely low enough for IV estimates to be reliable. We then show that an estimator that is unbiased given a population first-stage sign restriction has bias exceeding that of IV when the restriction is imposed on the data. But screening on the sign of the estimated first stage is shown to halve the median bias of conventional IV without reducing coverage. To the extent that sign-screening is already part of empirical workflows, reported IV estimates enjoy the minimal bias of sign-screened just-ID IV.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.10556&r=
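    A toy simulation of the sign-screening step discussed at the end of the abstract (illustrative only; the data-generating process is invented): keep the just-ID IV estimate only when the estimated first-stage coefficient has the theoretically predicted sign.

      import numpy as np

      def iv_just_id(y, x, z):
          zc, yc, xc = z - z.mean(), y - y.mean(), x - x.mean()
          pi_hat = (zc @ xc) / (zc @ zc)     # first-stage coefficient
          beta_iv = (zc @ yc) / (zc @ xc)    # Wald / just-ID IV estimate
          return pi_hat, beta_iv

      rng = np.random.default_rng(1)
      screened = []
      for _ in range(5000):
          n = 200
          z = rng.normal(size=n)
          u = rng.normal(size=n)                 # source of endogeneity
          x = 0.1 * z + u + rng.normal(size=n)   # fairly weak first stage
          y = 1.0 * x + u + rng.normal(size=n)   # true beta = 1
          pi_hat, beta_iv = iv_just_id(y, x, z)
          if pi_hat > 0:                         # sign screen
              screened.append(beta_iv)
      print("median of sign-screened estimates:", np.median(screened))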
  11. By: Armin Pourkhanali; Jonathan Keith; Xibin Zhang
    Abstract: This paper proposes using Chebyshev polynomials to approximate the time-varying parameters of a GARCH model, where the polynomial coefficients are estimated via numerical optimization using the function gradient descent method. We investigate the asymptotic properties of the estimates of the polynomial coefficients and of the subsequent estimate of the conditional variance. Monte Carlo studies are conducted to examine the performance of the proposed polynomial approximation. In empirical studies modelling daily returns of the US 30-year T-bond closing price and of the gold futures closing price, we find that, in terms of in-sample fitting and out-of-sample forecasting, our proposed time-varying model outperforms the constant-parameter counterpart and a benchmark time-varying model.
    Keywords: Chebyshev polynomials, function gradient descent algorithm, loss function, one-day-ahead forecast
    JEL: C14 C58
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2021-15&r=
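    A short sketch of the approximation device (the estimation step, a loss minimized by gradient descent, is omitted, and the coefficient values are hypothetical): a time-varying GARCH parameter is expanded in Chebyshev polynomials of rescaled time.

      import numpy as np

      def chebyshev_basis(T, order):
          """Chebyshev polynomial basis evaluated at rescaled time mapped to [-1, 1]."""
          tau = 2.0 * (np.arange(1, T + 1) / T) - 1.0
          return np.polynomial.chebyshev.chebvander(tau, order)  # shape (T, order + 1)

      T, order = 1000, 3
      B = chebyshev_basis(T, order)
      c = np.array([0.08, 0.03, -0.02, 0.01])  # polynomial coefficients (to be estimated)
      alpha_t = B @ c                          # time-varying ARCH parameter alpha(t)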
  12. By: Jelena Bradic; Weijie Ji; Yuqian Zhang
    Abstract: This paper proposes a confidence interval construction for heterogeneous treatment effects in the context of multi-stage experiments with $N$ samples and high-dimensional, $d$, confounders. Our focus is on the case of $d\gg N$, but the results obtained also apply to low-dimensional cases. We showcase that the bias of regularized estimation, unavoidable in high-dimensional covariate spaces, is mitigated with a simple double-robust score. In this way, no additional bias removal is necessary, and we obtain root-$N$ inference results while allowing multi-stage interdependency of the treatments and covariates. A memoryless property is also not assumed; treatment can possibly depend on all previous treatment assignments and all previous multi-stage confounders. Our results rely on certain sparsity assumptions about the underlying dependencies. We discover new product rate conditions necessary for robust inference with dynamic treatments.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.04924&r=
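    The multi-stage construction is involved; as a single-period illustration of the kind of double-robust score the abstract refers to (the standard AIPW score, not the paper's dynamic version), the score combines an outcome model and a propensity model so that the regularization bias of either fit is dampened by the other.

      import numpy as np

      def aipw_score(y, a, mu1, mu0, e):
          """Doubly robust (AIPW) score for E[Y(1) - Y(0)], given fitted outcome
          regressions mu1, mu0 and a fitted propensity e; averaging it over the
          sample gives the treatment effect estimate."""
          return (mu1 - mu0
                  + a * (y - mu1) / e
                  - (1.0 - a) * (y - mu0) / (1.0 - e))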
  13. By: Kyle Butts
    Abstract: This paper formalizes a common approach for estimating effects of treatment at a specific location using geocoded microdata. This estimator compares units immediately next to treatment (an inner ring) to units just slightly further away (an outer ring). I introduce intuitive assumptions needed to identify the average treatment effect among the affected units and illustrate pitfalls that occur when these assumptions fail. Since one of these assumptions requires knowledge of exactly how far treatment effects are experienced, I propose a new method that relaxes this assumption and allows for nonparametric estimation using the partitioning-based least squares developed in Cattaneo et al. (2019). Since treatment effects typically decay/change over distance, this estimator improves analysis by estimating a treatment effect curve as a function of distance from treatment. This is in contrast to the traditional method which, at best, identifies the average effect of treatment. To illustrate the advantages of this method, I show that Linden and Rockoff (2008) underestimate the effects of increased crime risk on home values closest to the treatment and overestimate how far the effects extend by selecting a treatment ring that is too wide.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.10192&r=
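    A minimal sketch of the ring comparison the paper formalizes, plus a crude binned version of the effect-over-distance curve (cutoffs and names are placeholders; the paper's partitioning-based estimator is more refined than these bin means).

      import numpy as np

      def ring_estimator(outcome, dist, inner=0.5, outer=1.0):
          """Difference in mean outcomes: dist <= inner vs. inner < dist <= outer."""
          treated_ring = outcome[dist <= inner]
          control_ring = outcome[(dist > inner) & (dist <= outer)]
          return treated_ring.mean() - control_ring.mean()

      def effect_curve(outcome, dist, edges):
          """Mean outcome within each distance bin, relative to the outermost bin;
          a piecewise-constant stand-in for a treatment-effect-over-distance curve."""
          means = np.array([outcome[(dist > lo) & (dist <= hi)].mean()
                            for lo, hi in zip(edges[:-1], edges[1:])])
          return means - means[-1]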
  14. By: Eiji Kurozumi; Anton Skrobotov
    Abstract: In this study, we extend the three-regime bubble model of Pang et al. (2021) to allow a fourth regime in which the process follows a unit root after recovery. We provide asymptotic and finite sample justification for the consistency of the collapse date estimator in the two-regime AR(1) model. The consistency allows us to split the sample before and after the date of collapse and to consider the estimation of the date of exuberance and the date of recovery separately. We also find that the limiting behavior of the recovery date estimator varies depending on the extent of explosiveness and recovery.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.04500&r=
  15. By: Caporin, Massimiliano; Costola, Michele
    Abstract: Analysing causality among oil prices and, in general, among financial and economic variables is of central relevance in applied economics studies. The recent contribution of Lu et al. (2014) proposes a novel test for causality: the DCC-MGARCH Hong test. We show that the critical values of the test statistic must be evaluated through simulations, thereby challenging the evidence in papers adopting the DCC-MGARCH Hong test. We also note that rolling Hong tests represent a more viable solution in the presence of short-lived causality periods.
    Keywords: Granger causality, Hong test, DCC-GARCH, Oil market, COVID-19
    JEL: C10 C13 C32 C58 Q43 Q47
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:zbw:safewp:324&r=
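    A generic sketch of the simulation-based critical values the authors call for (the Hong statistic itself is not reproduced; statistic_fn and simulate_null are placeholders the reader must supply): simulate the statistic under the null many times and take an empirical quantile.

      import numpy as np

      def simulated_critical_value(statistic_fn, simulate_null, n_sim=2000, level=0.95):
          """Empirical critical value: a quantile of the statistic's simulated
          null distribution, used in place of unreliable asymptotic values."""
          null_stats = np.array([statistic_fn(simulate_null()) for _ in range(n_sim)])
          return np.quantile(null_stats, level)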
  16. By: Matias D. Cattaneo; Paul Cheung; Xinwei Ma; Yusufcan Masatlioglu
    Abstract: We introduce an Attention Overload Model that captures the idea that alternatives compete for the decision maker's attention, and hence the attention frequency each alternative receives decreases as the choice problem becomes larger. Using this nonparametric restriction on the random attention formation, we show that a fruitful revealed preference theory can be developed, and provide testable implications on the observed choice behavior that can be used to partially identify the decision maker's preference. Furthermore, we provide novel partial identification results on the underlying attention frequency, thereby offering the first nonparametric identification result of (a feature of) the random attention formation mechanism in the literature. Building on our partial identification results, for both preferences and attention frequency, we develop econometric methods for estimation and inference. Importantly, our econometric procedures remain valid even in settings with a large number of alternatives and choice problems, an important feature of the economic environment we consider. We also provide a software package in R implementing our empirical methods, and illustrate them in a simulation study.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.10650&r=
  17. By: Michael Keane (School of Economics); Timothy Neal (UNSW School of Economics)
    Abstract: There is a long-standing controversy over the magnitude of the Frisch labor supply elasticity. Macroeconomists using DSGE models often calibrate it to be large, while many micro data studies find it is small. Several papers attempt to reconcile the micro and macro results. We offer a new and simple explanation: most micro studies estimate the Frisch using a 2SLS regression of hours changes on income changes, but the available instruments are typically "weak." In that case, we show it is an inherent property of 2SLS that estimates of the Frisch will (spuriously) appear more precise when they are shifted further in the direction of the OLS bias, which is negative. As a result, Frisch elasticities near zero will (spuriously) appear to be precisely estimated, while large estimates will appear imprecise. This pattern makes it difficult for a 2SLS t-test to detect a true positive Frisch elasticity. We show how the use of a weak-instrument robust hypothesis test, the Anderson-Rubin (AR) test, leads us to conclude that the Frisch elasticity is large and significant in the NLSY97 data. In contrast, a conventional 2SLS t-test would lead us to conclude it is not significantly different from zero. Our application illustrates a fundamental problem with 2SLS t-tests that arises quite generally, even with strong instruments. Thus, we argue the AR test should be widely adopted in lieu of the t-test.
    Keywords: Frisch elasticity, labor supply, weak instruments, 2SLS, Anderson-Rubin test
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:swe:wpaper:2021-07b&r=
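    For concreteness, a small sketch of the Anderson-Rubin test in the just-identified case (the generic textbook version, not the paper's exact implementation): under H0: beta = b0, the quasi-residual y - b0 * x is uncorrelated with the instrument, so the AR test is a robust t-test of the instrument in a regression of y - b0 * x on z. Inverting it over a grid of b0 values gives a confidence set that stays valid with weak instruments.

      import numpy as np
      from scipy import stats

      def ar_pvalue(y, x, z, b0):
          r = y - b0 * x                          # quasi-residual under H0: beta = b0
          zc = z - z.mean()
          rc = r - r.mean()
          gamma = (zc @ rc) / (zc @ zc)           # coefficient of z
          e = rc - gamma * zc
          var = np.sum((zc * e) ** 2) / (zc @ zc) ** 2   # heteroskedasticity-robust
          t_stat = gamma / np.sqrt(var)
          return 2.0 * (1.0 - stats.norm.cdf(abs(t_stat)))

      # AR confidence set: all b0 on a grid with ar_pvalue(y, x, z, b0) > 0.05.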
  18. By: Méndez Civieta, Álvaro; Aguilera Morillo, María del Carmen; Lillo Rodríguez, Rosa Elvira
    Abstract: Partial least squares (PLS) is a dimensionality reduction technique used as an alternative to ordinary least squares (OLS) in situations where the data are collinear or high dimensional. Both PLS and OLS provide mean-based estimates, which are extremely sensitive to the presence of outliers or heavy-tailed distributions. In contrast, quantile regression is an alternative to OLS that computes robust quantile-based estimates. In this work, multivariate PLS is extended to the quantile regression framework, obtaining a theoretical formulation of the problem and a robust dimensionality reduction technique that we call fast partial quantile regression (fPQR), which provides quantile-based estimates. An efficient implementation of fPQR is also derived, and its performance is studied through simulation experiments and the well-known biscuit dough dataset from chemometrics, a real high-dimensional example.
    Keywords: Partial-Least-Squares; Quantile-Regression; Dimension-Reduction; Outliers; Robust
    Date: 2021–10–18
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:33469&r=
  19. By: Parley Ruogu Yang; Ryan Lucas; Camilla Schelpe
    Abstract: We formally introduce a time series statistical learning method, called Adaptive Learning, capable of handling model selection, out-of-sample forecasting and interpretation in a noisy environment. Through simulation studies we demonstrate that the method can outperform traditional model selection techniques such as AIC and BIC in the presence of regime-switching, as well as facilitating window size determination when the Data Generating Process is time-varying. Empirically, we use the method to forecast S&P 500 returns across multiple forecast horizons, employing information from the VIX Curve and the Yield Curve. We find that Adaptive Learning models are generally on par with, if not better than, the best of the parametric models a posteriori, evaluated in terms of MSE, while also outperforming under cross validation. We present a financial application of the learning results and an interpretation of the learning regime during the 2020 market crash. These studies can be extended in both a statistical direction and in terms of financial applications.
    Date: 2021–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2110.11156&r=
  20. By: Härdle, Wolfgang; Klochkov, Yegor; Petukhina, Alla; Zhivotovskiy, Nikita
    Abstract: Markowitz mean-variance portfolios with sample mean and covariance as input parameters feature numerous issues in practice. They perform poorly out of sample due to estimation error, and they exhibit extreme weights together with high sensitivity to changes in the input parameters. The heavy-tail characteristics of financial time series are in fact the cause of these erratic fluctuations of weights, which consequently create substantial transaction costs. In robustifying the weights, we present a toolbox for stabilizing costs and weights for global minimum-variance Markowitz portfolios. Utilizing a projected gradient descent (PGD) technique, we avoid the estimation and inversion of the covariance operator as a whole and concentrate on robust estimation of the gradient descent increment. Using modern tools of robust statistics, we construct a computationally efficient estimator with almost Gaussian properties based on median-of-means uniformly over weights. This robustified Markowitz approach is confirmed by empirical studies on equity markets. We demonstrate that robustified portfolios reach higher risk-adjusted performance and the lowest turnover compared to shrinkage-based and constrained portfolios.
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:zbw:irtgdp:2021018&r=
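    A rough sketch of the approach as summarized in the abstract (step size, block count, and the coordinate-wise median are illustrative simplifications of the authors' median-of-means construction): projected gradient descent for a long-only global minimum-variance portfolio, with the risk gradient Sigma @ w estimated robustly from blocks of returns.

      import numpy as np

      def project_simplex(v):
          """Euclidean projection onto {w : w >= 0, sum(w) = 1}."""
          u = np.sort(v)[::-1]
          css = np.cumsum(u) - 1.0
          rho = np.nonzero(u - css / (np.arange(len(v)) + 1.0) > 0)[0][-1]
          return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

      def mom_gradient(returns, w, n_blocks=10):
          """Median-of-means style estimate of the gradient Sigma @ w."""
          blocks = np.array_split(returns, n_blocks)
          grads = [np.cov(b, rowvar=False) @ w for b in blocks]
          return np.median(np.stack(grads), axis=0)

      def robust_gmv(returns, steps=200, lr=0.5):
          d = returns.shape[1]
          w = np.full(d, 1.0 / d)               # start from equal weights
          for _ in range(steps):
              w = project_simplex(w - lr * mom_gradient(returns, w))
          return w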
  21. By: Mingyang Li; Linlin Niu
    Abstract: Motivated by two scenarios in which government spending in China reacted to output shocks within a quarter, this letter points out a downward bias in estimates of the Chinese government spending multiplier obtained using the classical lag restriction for shock identification in a quarterly SVAR framework à la Blanchard and Perotti (2002). By relaxing the lag-length restriction from one quarter to one month, we propose a mixed-frequency identification (MFI) strategy that takes the unexpected spending change in the first month of each quarter as an instrument. The estimation results show that the Chinese government reacts significantly and counter-cyclically to output shocks within a quarter, with the resulting government spending multiplier being 0.546 on impact and 1.849 at the maximum. A comparison study confirms that results based on the identification strategy of Blanchard and Perotti (2002) suffer from severe downward bias in such a case.
    Keywords: government spending multiplier; inside lag; mixed-frequency identification; SVAR model.
    JEL: C32 C36 E23 E62
    Date: 2021–10–19
    URL: http://d.repec.org/n?u=RePEc:wyi:wpaper:002594&r=
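    A stylized sketch of the instrument construction described in the abstract (a full-sample AR fit stands in for real-time expectations, and the lag order is an arbitrary choice): the instrument is the forecast error of monthly government spending in the first month of each quarter.

      import numpy as np

      def first_month_surprises(monthly_g, p=12):
          """One-step AR(p) forecast errors, kept only for the first month of each
          quarter (0-based month index divisible by 3)."""
          g = np.asarray(monthly_g, dtype=float)
          Y = g[p:]
          X = np.column_stack([np.ones(len(Y))] +
                              [g[p - k:len(g) - k] for k in range(1, p + 1)])
          beta = np.linalg.lstsq(X, Y, rcond=None)[0]
          resid = Y - X @ beta                   # "unexpected" spending changes
          months = np.arange(p, len(g))
          return resid[months % 3 == 0]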

This nep-ecm issue is ©2021 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.