nep-ecm New Economics Papers
on Econometrics
Issue of 2021‒03‒22
nineteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Discrete Mixtures of Normals Pseudo Maximum Likelihood Estimators of Structural Vector Autoregressions By Gabriele Fiorentini; Enrique Sentana
  2. Tail forecasts of inflation using time-varying parameter quantile regressions By Michael Pfarrhofer
  3. Markov Switching Panel with Endogenous Synchronization Effects By Komla M. Agudze; Monica Billio; Roberto Casarin; Francesco Ravazzolo
  4. Seasonality in High Frequency Time Series By Tommaso Proietti; Diego J. Pedregal
  5. Approximate Bayesian inference and forecasting in huge-dimensional multi-country VARs By Martin Feldkircher; Florian Huber; Gary Koop; Michael Pfarrhofer
  6. On a log-symmetric quantile tobit model applied to female labor supply data By Danúbia R. Cunha; Jose A. Divino; Helton Saulo
  7. Extremal points of Lorenz curves and applications to inequality analysis By Amparo Baíllo; Javier Cárcamo; Carlos Mora-Corral
  8. IV Regression with Possibly Uncorrelated Instruments By Emmanuel Selorm Tsyawo
  9. Regime switching models for directional and linear observations By Harvey, A.; Palumbo, D.
  10. Multivariate tail covariance for generalized skew-elliptical distributions By Baishuai Zuo; Chuancun Yin
  11. Rethinking the Error Correction Model in Macroeconometric Analysis: A Review By PINSHI, Christian P.
  12. Identifying high-frequency shocks with Bayesian mixed-frequency VARs By Alessia Paccagnini; Fabio Parla
  13. Causal Reinforcement Learning: An Instrumental Variable Approach By Jin Li; Ye Luo; Xiaowei Zhang
  14. DoubleML -- An Object-Oriented Implementation of Double Machine Learning in R By Philipp Bach; Victor Chernozhukov; Malte S. Kurz; Martin Spindler
  15. LATE Estimators under Costly Non-compliance in Student-College Matching Markets By Marin Drlje; Stepan Jurajda
  16. Dynamic Econometrics in Action: A Biography of David F. Hendry By Neil R. Ericsson
  17. Optimizing Expected Shortfall under an $\ell_1$ constraint -- an analytic approach By Gábor Papp; Imre Kondor; Fabio Caccioli
  18. Modelling Volatility Cycles: The (MF)^2 GARCH Model By Christian Conrad; Robert F. Engle
  19. Simultaneous Decorrelation of Matrix Time Series By Yuefeng Han; Rong Chen; Cun-Hui Zhang; Qiwei Yao

  1. By: Gabriele Fiorentini (Università di Firenze and RCEA); Enrique Sentana (CEMFI, Centro de Estudios Monetarios y Financieros)
    Abstract: Likelihood inference in structural vector autoregressions with independent non-Gaussian shocks leads to parametric identification and efficient estimation at the risk of inconsistencies under distributional misspecification. We prove that autoregressive coefficients and (scaled) impact multipliers remain consistent, but the drifts and standard deviations of the shocks are generally inconsistent. Nevertheless, we show consistency when the non-Gaussian log-likelihood is a discrete scale mixture of normals in the symmetric case, or an unrestricted finite mixture more generally. Our simulation exercises compare the efficiency of these estimators to other consistent proposals. Finally, our empirical application looks at dynamic linkages between three popular volatility indices.
    Keywords: Consistency, finite normal mixtures, pseudo maximum likelihood estimators, structural models, volatility indices.
    JEL: C32 C46 C51 C58
    Date: 2020–10
    URL: http://d.repec.org/n?u=RePEc:cmf:wpaper:wp2020_2023&r=all
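    For orientation, the pseudo log-likelihoods above are built from finite normal mixtures. A discrete mixture of K normals (a generic textbook form, not necessarily the paper's exact parameterization) has density
      $f(x) = \sum_{k=1}^{K} \lambda_k \sigma_k^{-1} \varphi((x-\mu_k)/\sigma_k)$, with $\lambda_k \geq 0$ and $\sum_{k=1}^{K} \lambda_k = 1$,
    where $\varphi$ is the standard normal density; the symmetric scale-mixture case restricts all the $\mu_k$ to be equal, so the mixture generates heavy tails through unequal scales $\sigma_k$ alone.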
  2. By: Michael Pfarrhofer
    Abstract: This paper proposes methods for Bayesian inference in time-varying parameter (TVP) quantile regression (QR) models. We use data augmentation schemes to make the conditional likelihood tractable and to render the model conditionally Gaussian, which yields an efficient Gibbs sampling algorithm. Regularization of the high-dimensional parameter space is achieved via flexible dynamic shrinkage priors. A simple version of the TVP-QR based on an unobserved component (UC) model is applied to dynamically trace the quantiles of the distribution of inflation in the United States (US), the United Kingdom (UK) and the euro area (EA). We conduct an out-of-sample inflation forecasting exercise to assess the predictive accuracy of the proposed framework against several benchmarks, using metrics that capture performance in different parts of the distribution. The proposed model is competitive and performs particularly well for higher-order and tail forecasts. We analyze the resulting predictive distributions and find that they are often skewed and feature heavier-than-normal tails.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.03632&r=all
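    A widely used data augmentation device in Bayesian quantile regression (shown here for illustration; the paper's exact scheme may differ) writes the asymmetric Laplace working likelihood at quantile level $p$ as a conditionally Gaussian scale mixture:
      $y_t = x_t'\beta_t + \theta z_t + \tau \sqrt{z_t} u_t$, $z_t \sim \mathrm{Exp}(1)$, $u_t \sim N(0,1)$, $\theta = (1-2p)/\{p(1-p)\}$, $\tau^2 = 2/\{p(1-p)\}$.
    Conditional on the latent $z_t$, the model is Gaussian, so standard state-space filtering and Gibbs sampling steps for the time-varying $\beta_t$ apply.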
  3. By: Komla M. Agudze (World Bank Group, Washington, US); Monica Billio (Ca' Foscari University of Venice, Italy); Roberto Casarin (Ca' Foscari University of Venice, Italy); Francesco Ravazzolo (Free University of Bozen-Bolzano, Italy; BI Norwegian Business School, Norway)
    Abstract: This paper introduces a new dynamic panel model with multi-layer network effects. Series-specific latent Markov chain processes drive the dynamics of the observable processes, and several types of interaction effects among the hidden chains allow for various degrees of endogenous synchronization of both latent and observable processes. The interaction is driven by a multi-layer network with exogenous and endogenous connectivity layers. We provide some theoretical properties of the model, develop a Bayesian inference framework and an efficient Markov Chain Monte Carlo algorithm for estimating parameters, latent states, and endogenous network layers. An application to the US-state coincident indicators shows that the synchronization in the US economy is generated by network effects among the states. The inclusion of a multi-layer network provides a new tool for measuring the effects of the public policies that impact the connectivity between the US states, such as mobility restrictions or job support schemes. The proposed new model and the related inference are general and may find application in a wide spectrum of datasets where the extraction of endogenous interaction effects is relevant and of interest.
    Keywords: Bayesian inference; interacting Markov chains; multi-layer networks; panel Markov-switching.
    JEL: C11 C13 C15 C23 C55
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:bzn:wpaper:bemps82&r=all
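    To fix ideas, one simple way (purely illustrative; the paper's specification is considerably richer) to let a panel of hidden two-state chains $s_{it}$ synchronize through a network with weights $w_{ij}$ is via transition probabilities of the form
      $P(s_{it} = k \mid s_{t-1}) \propto \exp( \gamma_{s_{i,t-1},k} + \delta \sum_{j \neq i} w_{ij} 1\{s_{j,t-1} = k\} )$,
    so that chain $i$ is more likely to enter regime $k$ when its network neighbours occupied $k$ in the previous period, with $\delta$ governing the strength of synchronization.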
  4. By: Tommaso Proietti (DEF & CEIS,Università di Roma "Tor Vergata"); Diego J. Pedregal (Universidad de Castilla-La Mancha)
    Abstract: Time series observed at frequencies higher than monthly display complex seasonal patterns, resulting from the combination of multiple seasonal cycles (with annual, monthly, weekly and daily periodicities) and of varying periods due to the irregularity of the calendar. The paper deals with modelling seasonality in high frequency data from two main perspectives: the stochastic harmonic approach, based on the Fourier representation of a periodic function, and the time-domain random effects approach. An encompassing representation illustrates the conditions under which they are equivalent. Three major challenges are considered. The first is modelling the effect of moving festivals, holidays and other breaks due to the calendar. The second is the need for robust estimation and filtering methods to tackle the level of outlier contamination, which is typically high owing to the lower level of temporal aggregation and the raw nature of the data. Finally, we focus on model selection strategies, which matter because the number of harmonic or random components needed to account for the complexity of seasonality can be very large.
    Keywords: State space models, robust filtering, seasonal adjustment, variable selection
    JEL: C22 C52 C58
    Date: 2021–03–11
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:508&r=all
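    In the stochastic harmonic approach mentioned above, a seasonal component with period $s$ is built from trigonometric terms with time-varying coefficients, e.g. (a standard state-space form, shown for concreteness)
      $\gamma_t = \sum_{j=1}^{J} ( a_{jt} \cos(2\pi j t/s) + b_{jt} \sin(2\pi j t/s) )$, $a_{jt} = a_{j,t-1} + \eta_{jt}$, $b_{jt} = b_{j,t-1} + \eta_{jt}^*$,
    with up to $J = \lfloor s/2 \rfloor$ harmonics; model selection then amounts to deciding which of the potentially very many harmonics to retain and which coefficients to keep fixed rather than time-varying.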
  5. By: Martin Feldkircher; Florian Huber; Gary Koop; Michael Pfarrhofer
    Abstract: The Panel Vector Autoregressive (PVAR) model is a popular tool for macroeconomic forecasting and structural analysis in multi-country applications, since it allows for spillovers between countries in a very flexible fashion. However, this flexibility means that the number of parameters to be estimated can be enormous, leading to over-parameterization concerns. Bayesian global-local shrinkage priors, such as the Horseshoe prior used in this paper, can overcome these concerns, but they require the use of Markov Chain Monte Carlo (MCMC) methods, rendering them computationally infeasible in high dimensions. In this paper, we develop computationally efficient Bayesian methods for estimating PVARs using an integrated rotated Gaussian approximation (IRGA). This exploits the fact that whereas own-country information is often important in PVARs, information on other countries is often unimportant. Using an IRGA, we split the posterior into two parts: one involving own-country coefficients, the other involving other-country coefficients. Fast methods such as approximate message passing or variational Bayes can be used on the latter and, conditional on these, the former are estimated with precision using MCMC methods. In a forecasting exercise involving PVARs with up to 18 variables for each of 38 countries, we demonstrate that our methods produce good forecasts quickly.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.04944&r=all
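    Schematically (a generic PVAR layout, not the paper's notation), the equation for country $c$ stacks own-country and cross-country lags,
      $y_{c,t} = \sum_{l=1}^{L} A_{cc,l} y_{c,t-l} + \sum_{j \neq c} \sum_{l=1}^{L} A_{cj,l} y_{j,t-l} + \epsilon_{c,t}$,
    and the IRGA-based split treats the own-country blocks $A_{cc,l}$ with exact MCMC while handling the typically less important cross-country blocks $A_{cj,l}$ with fast approximate methods.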
  6. By: Danúbia R. Cunha; Jose A. Divino; Helton Saulo
    Abstract: The classic censored regression model (tobit model) has been widely used in the economic literature. This model assumes normality for the error distribution, and it is not recommended for cases where positive skewness is present. Moreover, in regression analysis, it is well known that a quantile regression approach allows us to study the influence of the explanatory variables on the dependent variable at different quantiles. Therefore, we propose in this paper a quantile tobit regression model based on quantile-based log-symmetric distributions. The proposed methodology allows us to model data with positive skewness (for which the classic tobit model is not suitable) and to study the influence of the quantiles of interest, in addition to accommodating heteroscedasticity. The model parameters are estimated by maximum likelihood, and an elaborate Monte Carlo study is performed to evaluate the performance of the estimates. Finally, the proposed methodology is illustrated using two female labor supply data sets. The results show that the proposed log-symmetric quantile tobit model fits the data better than the classic tobit model.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.04449&r=all
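    For reference, the classic tobit model posits a latent outcome $y_i^* = x_i'\beta + \varepsilon_i$ observed as $y_i = \max(0, y_i^*)$. Under left-censoring at zero, conditional quantiles satisfy the Powell-type relation $Q_y(p \mid x) = \max\{0, Q_{y^*}(p \mid x)\}$, which is what makes a quantile formulation of the tobit model natural; the proposal above additionally parameterizes $Q_{y^*}$ directly through a quantile-based log-symmetric family.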
  7. By: Amparo Baíllo; Javier Cárcamo; Carlos Mora-Corral
    Abstract: We find the set of extremal points of Lorenz curves with fixed Gini index and compute the maximal $L^1$-distance between Lorenz curves with given values of their Gini coefficients. As an application we introduce a bidimensional index that simultaneously measures relative inequality and dissimilarity between two populations. This proposal employs the Gini indices of the variables and an $L^1$-distance between their Lorenz curves. The index takes values in a right-angled triangle, two of whose sides characterize perfect relative inequality, expressed by the Lorenz ordering between the underlying distributions. The hypotenuse, in turn, represents maximal distance between the two distributions. As a consequence, we construct a chart with which one can graphically track either the evolution of (relative) inequality and distance between two income distributions over time, or compare the income distribution of a specific population between a fixed time point and a range of years. We prove the mathematical results behind the above claims and provide a full description of the asymptotic properties of the plug-in estimator of this index. Finally, we apply the proposed bidimensional index to several real EU-SILC income datasets to illustrate its performance in practice.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.03286&r=all
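    The building blocks of the proposed bidimensional index can be computed directly from samples. Below is a minimal Python sketch (illustrative only; the function names, grid size and simulated data are our own choices, not the authors') of the empirical Lorenz curve, the sample Gini index and the $L^1$-distance between two Lorenz curves:

      import numpy as np

      def lorenz_curve(x, grid):
          # Empirical Lorenz curve L(p): income share held by the poorest fraction p.
          x = np.sort(np.asarray(x, dtype=float))
          shares = np.insert(np.cumsum(x) / x.sum(), 0, 0.0)  # L(0) = 0
          p = np.linspace(0.0, 1.0, len(shares))
          return np.interp(grid, p, shares)

      def gini(x):
          # Sample Gini index via the rank-based formula.
          x = np.sort(np.asarray(x, dtype=float))
          n = len(x)
          ranks = np.arange(1, n + 1)
          return 2.0 * (ranks @ x) / (n * x.sum()) - (n + 1.0) / n

      def lorenz_l1_distance(x, y, m=10001):
          # Riemann-sum approximation of the L1 distance between two Lorenz curves.
          grid = np.linspace(0.0, 1.0, m)
          return np.mean(np.abs(lorenz_curve(x, grid) - lorenz_curve(y, grid)))

      rng = np.random.default_rng(0)
      a = rng.lognormal(mean=0.0, sigma=0.5, size=5000)  # less unequal population
      b = rng.lognormal(mean=0.0, sigma=1.0, size=5000)  # more unequal population
      print(gini(a), gini(b), lorenz_l1_distance(a, b))

    Tracking the pair of Gini indices together with the Lorenz $L^1$-distance over time gives the kind of chart the abstract describes.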
  8. By: Emmanuel Selorm Tsyawo
    Abstract: This paper proposes a closed-form linear IV estimator that allows endogenous covariates to be weakly correlated, or even uncorrelated, with the instruments, provided they are mean-dependent on them. Identification rests on (1) a weak uncorrelatedness exclusion restriction and (2) a weak relevance condition under which covariates are mean-dependent on instruments. This significant weakening of the relevance condition does not come at the cost of a stronger exclusion restriction. The estimator is root-n-consistent and asymptotically normal. Monte Carlo simulations show that the estimator exploits unknown forms of both monotone and non-monotone identifying variation equally well, and that it incurs less bias and size distortion than conventional IV methods when instruments are weak. An empirical example illustrates the practical usefulness of the estimator.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.09621&r=all
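    In standard linear IV notation, identification is usually based on the exclusion restriction $E[z(y - x'\beta_0)] = 0$ together with the relevance condition $\mathrm{Cov}(z, x) \neq 0$. The proposal above keeps a weak version of the former while replacing the latter with mean dependence, i.e. $E[x \mid z]$ being a nonconstant function of $z$; this can hold even when $\mathrm{Cov}(z, x) = 0$, for instance when $E[x \mid z]$ is an even function of a symmetrically distributed instrument $z$.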
  9. By: Harvey, A.; Palumbo, D.
    Abstract: The score-driven approach to time series modelling provides a solution to the problem of modelling circular data, and it can also be used to model switching regimes with intra-regime dynamics. Furthermore, it enables a dynamic model to be fitted to a linear and a circular variable when the joint distribution is a cylinder. The viability of the new method is illustrated by estimating a model with dynamic switching and dynamic location and/or scale in each regime, using hourly data on wind direction and speed in Galicia, north-west Spain.
    Keywords: Circular data, conditional score, cylinder, hidden Markov model, von Mises distribution, wind.
    JEL: C22 C32
    Date: 2021–03–10
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:2123&r=all
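    The von Mises distribution named in the keywords is the canonical model for circular data: a direction $\theta \in [-\pi, \pi)$ has density
      $f(\theta \mid \mu, \kappa) = \exp\{\kappa \cos(\theta - \mu)\} / \{2\pi I_0(\kappa)\}$,
    where $\mu$ is the mean direction, $\kappa \geq 0$ the concentration, and $I_0$ the modified Bessel function of order zero. In a score-driven model, the dynamics of $\mu$ (and possibly $\kappa$) are updated by the score of this density, which is what makes the approach workable for circular and cylindrical data.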
  10. By: Baishuai Zuo; Chuancun Yin
    Abstract: In this paper, the multivariate tail covariance (MTCov) for generalized skew-elliptical distributions is considered. Some special cases of this distribution, such as the generalized skew-normal, generalized skew Student-t, generalized skew-logistic and generalized skew-Laplace distributions, are also considered. In order to test the theoretical feasibility of our results, the MTCov for skewed and non-skewed normal distributions is computed and compared. Finally, we give a special formula for the MTCov of generalized skew-elliptical distributions.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.05201&r=all
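    For orientation (a common definition in this literature; the paper's conventions may differ in detail), the MTCov conditions on componentwise exceedance of the value-at-risk vector $VaR_q(X) = (VaR_q(X_1), \dots, VaR_q(X_n))'$:
      $\mathrm{MTCov}_q(X) = E[ (X - \mathrm{MTCE}_q(X))(X - \mathrm{MTCE}_q(X))' \mid X > VaR_q(X) ]$,
    where $\mathrm{MTCE}_q(X) = E[X \mid X > VaR_q(X)]$ is the multivariate tail conditional expectation; the MTCov is thus the covariance matrix of $X$ restricted to the joint tail event.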
  11. By: PINSHI, Christian P.
    Abstract: The methodology of cointegration filled the void that existed between economic theorists and econometricians in understanding the dynamics and equilibrium of macroeconomic and financial analysis, and the reliability biases that arise when the underlying series display non-stationary behavior. This article provides a review of the power of the error correction model. Theorists and econometricians have shown that the error correction model is a powerful workhorse that supplies macroeconomic policy with refined econometric results.
    Keywords: Cointegration, Error correction model, Inflation, Exchange rate
    JEL: C18 C32 E52 E60 F41
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:106694&r=all
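    For reference, the single-equation error correction model for I(1) variables $y_t$ and $x_t$ that are cointegrated with vector $(1, -\beta)$ takes the familiar form
      $\Delta y_t = \alpha (y_{t-1} - \beta x_{t-1}) + \sum_{j=1}^{p} \gamma_j \Delta y_{t-j} + \sum_{j=0}^{q} \delta_j \Delta x_{t-j} + \varepsilon_t$,
    where $\alpha < 0$ measures the speed at which $y_t$ adjusts back towards the long-run equilibrium $y = \beta x$; it is this combination of long-run information and short-run dynamics that the review credits for the model's power.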
  12. By: Alessia Paccagnini; Fabio Parla
    Abstract: We contribute to research on mixed-frequency regressions by introducing an innovative Bayesian approach. Based on a new “high-frequency” identification scheme, we provide novel empirical evidence on the identification of an uncertainty shock for the US economy. As our main finding, we document a “temporal aggregation bias” that arises when we adopt a common low-frequency model instead of estimating a mixed-frequency framework. The bias is amplified when we identify a higher-frequency shock.
    Keywords: Bayesian mixed-frequency VAR, MIDAS, uncertainty shocks, macro-financial linkages
    JEL: C32 E44 E52
    Date: 2021–02
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2021-26&r=all
  13. By: Jin Li; Ye Luo; Xiaowei Zhang
    Abstract: In the standard data analysis framework, data is first collected (once and for all), and then data analysis is carried out. With the advancement of digital technology, however, decision-makers constantly analyze past data and generate new data through the decisions they make. In this paper, we model this as a Markov decision process and show that the dynamic interaction between data generation and data analysis leads to a new type of bias -- reinforcement bias -- that exacerbates the endogeneity problem in standard data analysis. We propose a class of instrumental variable (IV)-based reinforcement learning (RL) algorithms to correct for the bias and establish their asymptotic properties by incorporating them into a two-timescale stochastic approximation framework. A key contribution of the paper is the development of new techniques that allow for the analysis of the algorithms in general settings where the noise features time dependency. We use these techniques to derive sharper results on finite-time trajectory stability bounds: with a polynomial rate, the entire future trajectory of the iterates from the algorithm falls within a ball that is centered at the true parameter and is shrinking at a (different) polynomial rate. We also use the techniques to provide formulas for inference, which is rarely available for RL algorithms. These formulas highlight how the strength of the IV and the degree of the noise's time dependency affect the inference.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.04021&r=all
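    As background, a two-timescale stochastic approximation scheme of the kind invoked above iterates two coupled recursions at different speeds (a generic template, not the paper's specific algorithm):
      $\theta_{k+1} = \theta_k + a_k h(\theta_k, w_k, \xi_k)$, $w_{k+1} = w_k + b_k g(\theta_k, w_k, \xi_k)$, with $a_k / b_k \to 0$,
    so the fast iterate $w_k$ effectively tracks its equilibrium for the current $\theta_k$, while the slow iterate $\theta_k$ sees an averaged version of the fast dynamics; the paper's contribution is to carry this analysis through when the noise sequence $\xi_k$ is time-dependent rather than i.i.d.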
  14. By: Philipp Bach; Victor Chernozhukov; Malte S. Kurz; Martin Spindler
    Abstract: The R package DoubleML implements the double/debiased machine learning framework of Chernozhukov et al. (2018). It provides functionality to estimate parameters in causal models based on machine learning methods. The double machine learning framework consists of three key ingredients: Neyman orthogonality, high-quality machine learning estimation and sample splitting. Estimation of nuisance components can be performed by various state-of-the-art machine learning methods available in the mlr3 ecosystem. DoubleML makes it possible to perform inference in a variety of causal models, including partially linear and interactive regression models and their extensions to instrumental variable estimation. The object-oriented implementation of DoubleML affords great flexibility in model specification and makes the package easily extensible. This paper serves as an introduction to the double machine learning framework and to the R package DoubleML. In reproducible code examples with simulated and real data sets, we demonstrate how DoubleML users can perform valid inference based on machine learning methods.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.09603&r=all
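    To make the three ingredients concrete, here is a minimal Python sketch of cross-fitted double ML for the partially linear model $y = \theta d + g(X) + \varepsilon$ using the partialling-out score (learner choice and function names are our own; the DoubleML package itself offers far more, in both R and Python):

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import KFold

      def dml_plr(y, d, X, n_folds=5, seed=1):
          # Cross-fitted partialling-out estimator of theta in y = theta*d + g(X) + e.
          y_res = np.zeros(len(y))
          d_res = np.zeros(len(d))
          for train, test in KFold(n_folds, shuffle=True, random_state=seed).split(X):
              ml_l = RandomForestRegressor(random_state=seed).fit(X[train], y[train])
              ml_m = RandomForestRegressor(random_state=seed).fit(X[train], d[train])
              y_res[test] = y[test] - ml_l.predict(X[test])  # residualized outcome
              d_res[test] = d[test] - ml_m.predict(X[test])  # residualized treatment
          theta = (d_res @ y_res) / (d_res @ d_res)          # residual-on-residual OLS
          psi = (y_res - theta * d_res) * d_res              # Neyman-orthogonal score
          se = np.sqrt(psi @ psi) / (d_res @ d_res)          # influence-function SE
          return theta, se

      rng = np.random.default_rng(42)
      X = rng.normal(size=(2000, 10))
      d = X[:, 0] + rng.normal(size=2000)
      y = 0.5 * d + np.sin(X[:, 1]) + rng.normal(size=2000)  # true effect is 0.5
      theta, se = dml_plr(y, d, X)
      print(theta, se)

    Sample splitting (the KFold loop) removes the own-observation overfitting bias, and the orthogonal score makes the estimate of theta insensitive to small errors in the two nuisance fits.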
  15. By: Marin Drlje; Stepan Jurajda
    Abstract: A growing literature exploits a feature of centralized college admission systems where students with similar admission scores in a neighborhood of a school’s admission threshold are or are not offered admission based on small quasi-random differences in admission scores. Assuming that the students at the margin of admission differ only in the treatment assignment, this literature relies on admission scores to instrument for admission or graduation. We point out that non-compliance with the centralized matching assignment typically corresponds to enrolling in one’s preferred program a year after the initial assignment, introducing significant non-compliance costs. We show that with costly non-compliance, the exclusion restriction, the key assumption of the LATE theorem, is violated, leading to biased estimates when instrumenting for graduation, i.e., for a treatment taking place after non-compliance costs are incurred. We use data from a student-college matching market in Croatia to illustrate the empirical importance of this potential source of bias and propose a method inspired by Lee (2009), which recovers the treatment effect bounds under the assumption that the costs of non-compliance are not related to the treatment assignment.
    Date: 2021–02
    URL: http://d.repec.org/n?u=RePEc:cer:papers:wp686&r=all
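    For orientation, with a binary instrument $Z$ (the admission offer) and a binary treatment $D$ (e.g. graduation from the assigned program), the standard LATE estimand is the Wald ratio
      $\mathrm{LATE} = ( E[Y \mid Z=1] - E[Y \mid Z=0] ) / ( E[D \mid Z=1] - E[D \mid Z=0] )$,
    which identifies the compliers' average effect only if $Z$ affects $Y$ solely through $D$. Costly non-compliance (a year lost before enrolling in the preferred program) opens a direct channel from $Z$ to $Y$, which is exactly the exclusion-restriction violation the paper documents.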
  16. By: Neil R. Ericsson (Division of International Finance, Board of Governors of the Federal Reserve System)
    Abstract: David Hendry has made, and continues to make, pivotal contributions to the econometrics of empirical economic modeling, economic forecasting, econometrics software, substantive empirical economic model design, and economic policy. This paper reviews his contributions by topic, emphasizing the overlaps between different strands in his research and the importance of real-world problems in motivating that research.
    Keywords: cointegration, consumers' expenditure, dynamic specification, equilibrium correction, forecasting, machine learning, model evaluation, money demand, PcGive, structural breaks
    JEL: C52 C53
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:gwc:wpaper:2021-001&r=all
  17. By: Gábor Papp; Imre Kondor; Fabio Caccioli
    Abstract: Expected Shortfall (ES), the average loss above a high quantile, is the current financial regulatory market risk measure. Its estimation and optimization are highly unstable against sample fluctuations and become impossible above a critical ratio $r=N/T$, where $N$ is the number of different assets in the portfolio, and $T$ is the length of the available time series. The critical ratio depends on the confidence level $\alpha$, which means we have a line of critical points on the $\alpha-r$ plane. The large fluctuations in the estimation of ES can be attenuated by the application of regularizers. In this paper, we calculate ES analytically under an $\ell_1$ regularizer by the method of replicas borrowed from the statistical physics of random systems. The ban on short selling, i.e. a constraint rendering all the portfolio weights non-negative, is a special case of an asymmetric $\ell_1$ regularizer. Results are presented for the out-of-sample and the in-sample estimator of the regularized ES, the estimation error, the distribution of the optimal portfolio weights and the density of the assets eliminated from the portfolio by the regularizer. It is shown that the no-short constraint acts as a high volatility cutoff, in the sense that it sets the weights of the high volatility elements to zero with higher probability than those of the low volatility items. This cutoff renormalizes the aspect ratio $r=N/T$, thereby extending the range of the feasibility of optimization. We find that there is a nontrivial mapping between the regularized and unregularized problems, corresponding to a renormalization of the order parameters.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.04375&r=all
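    For reference, at confidence level $\alpha$ the Expected Shortfall of a loss $L$ is $ES_\alpha(L) = E[L \mid L \geq VaR_\alpha(L)]$ (for continuous $L$), and the Rockafellar-Uryasev representation
      $ES_\alpha(L) = \min_c \{ c + (1-\alpha)^{-1} E[(L - c)_+] \}$
    turns its minimization over portfolio weights $w$ into a convex program; the $\ell_1$-regularized problem studied above adds a penalty proportional to $\sum_i |w_i|$ to this objective, with the no-short ban arising as a special case of an asymmetric penalty.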
  18. By: Christian Conrad (Department of Economics, Heidelberg University, Germany; KOF Swiss Economic Institute, Switzerland; Rimini Centre for Economic Analysis); Robert F. Engle (New York University, Stern School of Business, USA; Rimini Centre for Economic Analysis)
    Abstract: We suggest a multiplicative factor multi-frequency component GARCH model which exploits the empirical fact that the daily standardized forecast errors of standard GARCH models behave counter-cyclically when averaged at a lower frequency. For the new model, we derive the unconditional variance of the returns, the news impact function and multi-step-ahead volatility forecasts. We apply the model to the S&P 500, the FTSE 100 and the Hang Seng Index. We show that the long-term component of stock market volatility is driven by news about the macroeconomic outlook and monetary policy, as well as policy-related news. The new component model significantly outperforms the nested one-component (GJR) GARCH and several HAR-type models in terms of out-of-sample forecasting.
    Keywords: Volatility forecasting, long- and short-term volatility, mixed frequency data, volatility cycles
    JEL: C53 C58 G12
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:21-05&r=all
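    Generically, multiplicative component GARCH models decompose the conditional variance as
      $r_t = \sqrt{\tau_t h_t} z_t$, $z_t \sim \mathrm{i.i.d.}(0,1)$,
    where $h_t$ is a unit-mean short-term (daily) GARCH component and $\tau_t$ a smoother long-term component moving at a lower frequency. This is only the generic two-component skeleton, not the exact (MF)^2 specification; the abstract's observation that averaged squared standardized forecast errors behave counter-cyclically is what lets the data pin down the long-term component.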
  19. By: Yuefeng Han; Rong Chen; Cun-Hui Zhang; Qiwei Yao
    Abstract: We propose a contemporaneous bilinear transformation for matrix time series to alleviate the difficulties in modelling and forecasting a large number of time series together. More precisely, the transformed matrix splits into several small matrices, and those small matrix series are uncorrelated across all times. Hence effective dimension reduction is achieved by modelling each of those small matrix series separately, without loss of information on the overall linear dynamics. We adopt a bilinear transformation such that the rows and the columns of the matrix do not mix together, as they typically represent radically different features. As the targeted transformation is not unique, we identify an ideal version through a new normalization, which facilitates the no-cancellation accumulation of the information from different time lags. Non-asymptotic error bounds for the estimated transformation are derived, leading to uniform convergence rates for the estimation. The proposed method is illustrated numerically via both simulated and real data examples.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.09411&r=all
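    Concretely, a contemporaneous bilinear transformation of a matrix series $X_t$ takes the form $Z_t = U' X_t V$ for nonsingular matrices $U$ (acting on the rows) and $V$ (acting on the columns), so rows and columns are transformed separately rather than mixed through a single vectorized rotation (notation here is ours, for illustration). The estimation problem is then to choose $U$ and $V$ so that $Z_t$ partitions into small blocks that are uncorrelated with one another across all time lags, after which each block can be modelled on its own.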

This nep-ecm issue is ©2021 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.