nep-ets New Economics Papers
on Econometric Time Series
Issue of 2021‒09‒06
six papers chosen by
Jaqueson K. Galimberti
Auckland University of Technology

  1. Exploring volatility of crude oil intra-day return curves: a functional GARCH-X Model By Rice, Gregory; Wirjanto, Tony; Zhao, Yuqian
  2. Robust Bayesian Analysis for Econometrics By Raffaella Giacomini; Toru Kitagawa; Matthew Read
  3. Extreme Conditional Expectile Estimation in Heavy-Tailed Heteroscedastic Regression Models By Stéphane Girard; Gilles Claude Stupfler; Antoine Usseglio-Carleve
  4. Bilinear Input Normalization for Neural Networks in Financial Forecasting By Dat Thanh Tran; Juho Kanniainen; Moncef Gabbouj; Alexandros Iosifidis
  5. Income Business Cycles By Geraldine Dany-Knedlik; Alexander Kriwoluzky; Sandra Pasch
  6. Uncertainty and Monetary Policy during the Great Recession By Giovanni Pellegrino; Efrem Castelnuovo; Giovanni Caggiano

  1. By: Rice, Gregory; Wirjanto, Tony; Zhao, Yuqian
    Abstract: Crude oil intra-day return curves collected from the commodity futures market often appear to be serially uncorrelated and long-range dependent. Existing functional GARCH models, while able to accommodate short-range conditional heteroscedasticity, are not designed to capture long-range dependence. We propose and study a new functional GARCH-X model for this purpose, where the covariate X is chosen to be weakly stationary and long-range dependent. Functional analogs of autocorrelation coefficients of squared processes for this model are derived and compared to those estimated from crude oil return curves. The results show that the FGARCH-X model provides a significant correction to existing functional volatility models in terms of in-sample fit, although its out-of-sample performance does not appear superior to that of existing functional GARCH models.
    Keywords: Crude oil intra-day return curves, volatility modeling and forecasting, functional GARCH-X model, long-range dependence, basis selection
    JEL: C13 C32 C58 G10 G17
    Date: 2021–08–18
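The paper's model operates on entire intra-day return curves; as a hypothetical illustration of the recursion it extends, the following is a scalar GARCH(1,1)-X analogue, in which the conditional variance depends on lagged squared shocks, its own lag, and a persistent exogenous covariate X (parameter values are invented for the sketch):

```python
import numpy as np

# Scalar GARCH(1,1)-X sketch (the paper's model is functional, not scalar):
#   sigma2_t = omega + alpha * eps_{t-1}^2 + beta * sigma2_{t-1} + gamma * X_{t-1}
rng = np.random.default_rng(0)
omega, alpha, beta, gamma = 0.05, 0.08, 0.85, 0.10  # illustrative values

T = 500
# A slowly evolving positive covariate standing in for a long-range-dependent X
x = np.abs(np.cumsum(rng.normal(0.0, 0.01, T)) + 1.0)

eps = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance ignoring X
for t in range(1, T):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1] + gamma * x[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * rng.normal()
```

Setting gamma = 0 recovers a plain GARCH(1,1); the X term is what lets the conditional variance inherit the covariate's persistence.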
  2. By: Raffaella Giacomini; Toru Kitagawa; Matthew Read
    Abstract: We review the literature on robust Bayesian analysis as a tool for global sensitivity analysis and for statistical decision-making under ambiguity. We discuss the methods proposed in the literature, including the different ways of constructing the set of priors that are the key input of the robust Bayesian analysis. We consider both a general set-up for Bayesian statistical decisions and inference and the special case of set-identified structural models. We provide new results that can be used to derive and compute the set of posterior moments for sensitivity analysis and to compute the optimal statistical decision under multiple priors. The paper ends with a self-contained discussion of three different approaches to robust Bayesian inference for set-identified structural vector autoregressions, including details about numerical implementation and an empirical illustration.
    Keywords: ambiguity; Bayesian robustness; statistical decision theory; identifying restrictions; multiple priors; structural vector autoregression
    JEL: C11 C18 C52
    Date: 2021–08–23
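The robust Bayesian output described in the abstract is a set of posterior quantities rather than a single one. A stylized toy version (not the paper's set-identified SVAR setting) uses Bernoulli data with a class of Beta priors and reports the resulting interval of posterior means:

```python
import numpy as np

# Toy multiple-priors exercise: n Bernoulli trials with s successes, prior
# class Beta(a, b) with (a, b) on a grid.  Each prior yields posterior mean
# (a + s) / (a + b + n); the robust report is the range over the class.
n, s = 50, 30
grid = np.linspace(0.5, 5.0, 10)
post_means = [(a + s) / (a + b + n) for a in grid for b in grid]
lower, upper = min(post_means), max(post_means)
# [lower, upper] is the set of posterior means implied by the prior class
```

As the prior class widens, the interval widens; as n grows, it shrinks toward the maximum-likelihood estimate s/n, which is the basic sensitivity-analysis logic the review formalizes.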
  3. By: Stéphane Girard (LJK - Laboratoire Jean Kuntzmann - Inria - Institut National de Recherche en Informatique et en Automatique - CNRS - Centre National de la Recherche Scientifique - UGA - Université Grenoble Alpes - Grenoble INP - Institut polytechnique de Grenoble - Grenoble Institute of Technology - UGA - Université Grenoble Alpes, STATIFY - Modèles statistiques bayésiens et des valeurs extrêmes pour données structurées et de grande dimension - Inria Grenoble - Rhône-Alpes - Inria - Institut National de Recherche en Informatique et en Automatique - LJK - Laboratoire Jean Kuntzmann - Inria - Institut National de Recherche en Informatique et en Automatique - CNRS - Centre National de la Recherche Scientifique - UGA - Université Grenoble Alpes - Grenoble INP - Institut polytechnique de Grenoble - Grenoble Institute of Technology - UGA - Université Grenoble Alpes); Gilles Claude Stupfler (CREST - Centre de Recherche en Economie et Statistique [Bruz] - ENSAI - Ecole Nationale de la Statistique et de l'Analyse de l'Information [Bruz]); Antoine Usseglio-Carleve (TSE - Toulouse School of Economics - UT1 - Université Toulouse 1 Capitole - Université Fédérale Toulouse Midi-Pyrénées - EHESS - École des hautes études en sciences sociales - CNRS - Centre National de la Recherche Scientifique - INRAE - Institut National de Recherche pour l’Agriculture, l’Alimentation et l’Environnement)
    Abstract: Expectiles define a least squares analogue of quantiles. They have been the focus of a substantial quantity of research in the context of actuarial and financial risk assessment over the last decade. The behaviour and estimation of unconditional extreme expectiles using independent and identically distributed heavy-tailed observations has been investigated in a recent series of papers. We build here a general theory for the estimation of extreme conditional expectiles in heteroscedastic regression models with heavy-tailed noise; our approach is supported by general results of independent interest on residual-based extreme value estimators in heavy-tailed regression models, and is intended to cope with covariates having a large but fixed dimension. We demonstrate how our results can be applied to a wide class of important examples, including linear models, single-index models, and ARMA and GARCH time series models. Our estimators are showcased in a numerical simulation study and on real sets of actuarial and financial data.
    Keywords: Tail empirical process of residuals, Single-index model, Residual-based estimators, Regression models, Heteroscedasticity, Heavy-tailed distribution, Extreme value analysis, Expectiles
    Date: 2021–06
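The "least squares analogue of quantiles" in the abstract can be made concrete: the tau-expectile minimizes an asymmetrically weighted squared loss, and its first-order condition gives a weighted-mean fixed point. A sketch of the unconditional i.i.d. case (the paper's contribution is the conditional, heteroscedastic, extreme case):

```python
import numpy as np

# The tau-expectile theta solves the fixed point
#   theta = sum(w_i * y_i) / sum(w_i),  w_i = tau if y_i > theta else 1 - tau,
# which we iterate to convergence.  tau = 0.5 gives the mean.
def expectile(y, tau, n_iter=100):
    theta = y.mean()
    for _ in range(n_iter):
        w = np.where(y > theta, tau, 1.0 - tau)
        theta = np.sum(w * y) / np.sum(w)
    return theta

rng = np.random.default_rng(1)
y = rng.standard_t(df=4, size=10_000)  # heavy-tailed sample
e50 = expectile(y, 0.50)  # coincides with the sample mean
e99 = expectile(y, 0.99)  # sits far in the right tail
```

Unlike the 0.99-quantile, the 0.99-expectile depends on the magnitude of every tail observation, which is why expectiles are attractive for risk assessment with heavy tails.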
  4. By: Dat Thanh Tran; Juho Kanniainen; Moncef Gabbouj; Alexandros Iosifidis
    Abstract: Data normalization is one of the most important preprocessing steps when building a machine learning model, especially when the model of interest is a deep neural network. This is because deep neural networks optimized with stochastic gradient descent are sensitive to the input variable range and prone to numerical issues. Unlike other types of signals, financial time-series often exhibit unique characteristics such as high volatility, non-stationarity and multi-modality that make them challenging to work with, often requiring expert domain knowledge for devising a suitable processing pipeline. In this paper, we propose a novel data-driven normalization method for deep neural networks that handle high-frequency financial time-series. The proposed normalization scheme, which takes into account the bimodal characteristic of financial multivariate time-series, requires no expert knowledge to preprocess a financial time-series, since this step is formulated as part of the end-to-end optimization process. Our experiments, conducted with state-of-the-art neural networks and high-frequency data from two large-scale limit order books from the Nordic and US markets, show significant improvements over other normalization techniques in forecasting future stock price dynamics.
    Date: 2021–09
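The paper's bilinear scheme learns row- and column-wise shifts and scales of each (time x feature) input matrix jointly with the network; as a point of contrast, here is a sketch of the kind of hand-crafted preprocessing it aims to replace, a per-feature rolling z-score that uses only trailing statistics to avoid look-ahead:

```python
import numpy as np

# Static baseline for contrast with the paper's learned normalization:
# z-score each observation against the mean/std of a trailing window,
# feature by feature, so no future information leaks into the inputs.
def rolling_zscore(x, window):
    # x: (T, F) multivariate series
    out = np.full_like(x, np.nan, dtype=float)
    for t in range(window, x.shape[0]):
        mu = x[t - window:t].mean(axis=0)
        sd = x[t - window:t].std(axis=0) + 1e-8  # guard against zero variance
        out[t] = (x[t] - mu) / sd
    return out

rng = np.random.default_rng(2)
prices = np.cumsum(rng.normal(0.0, 1.0, (300, 4)), axis=0)  # 4 random walks
z = rolling_zscore(prices, window=50)  # first 50 rows stay NaN (no history)
```

Choices like the window length and the statistic (mean/std vs. median/IQR) are exactly the expert-knowledge decisions the paper folds into end-to-end training instead.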
  5. By: Geraldine Dany-Knedlik; Alexander Kriwoluzky; Sandra Pasch
    Abstract: Using a wide variety of business cycle dating and filtering techniques, this paper documents the cyclical behavior of the post-tax income distribution in the US. First, all incomes are cyclical and co-move with the business cycle. Second, lower- and higher-income individuals experience significantly larger fluctuations across the business cycle than middle-income individuals. Third, these fluctuations became smaller over the course of the Great Moderation for the bottom and the very top income individuals. With the financial crisis starting in 2009 and its repercussions, the volatilities have been increasing again, although not significantly. These findings are independent of the method used to extract the business cycle component.
    Keywords: Cyclicity of the income distribution, business cycle
    JEL: E01 E32 D31
    Date: 2021
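One of the filtering techniques such papers typically vary over is the Hodrick-Prescott filter, whose trend solves a small linear system; a minimal sketch (the paper's own choice of filters is not specified beyond "a wide variety"):

```python
import numpy as np

# Hodrick-Prescott filter: the trend tau solves (I + lam * D'D) tau = y,
# where D is the second-difference operator; the cycle is y - tau.
def hp_filter(y, lam=1600.0):  # lam = 1600 is the conventional quarterly value
    T = len(y)
    D = np.diff(np.eye(T), n=2, axis=0)  # (T-2, T) second-difference matrix
    trend = np.linalg.solve(np.eye(T) + lam * (D.T @ D), y)
    return trend, y - trend

rng = np.random.default_rng(3)
t = np.arange(200)
y = 0.05 * t + np.sin(t / 8) + rng.normal(0.0, 0.1, 200)  # trend + cycle + noise
trend, cycle = hp_filter(y)
```

Because the penalty matrix D'D annihilates linear trends, the extracted cycle sums to zero by construction; robustness checks then repeat the analysis with alternative lam values or different filters altogether.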
  6. By: Giovanni Pellegrino (Aarhus University); Efrem Castelnuovo (University of Padova); Giovanni Caggiano (Monash University and University of Padova)
    Abstract: We employ a nonlinear VAR framework and a state-of-the-art identification strategy to document the large response of real activity to a financial uncertainty shock during and in the aftermath of the Great Recession. We replicate this evidence with an estimated DSGE framework featuring a concept of uncertainty comparable to that in our VAR. We then use the estimated framework to quantify the output loss due to the large uncertainty shock that materialized in 2008Q3. We find that this shock explains about 60% of the output loss in the 2008-2014 period. The same estimated model unveils the role successfully played by the Federal Reserve in limiting the output loss that would otherwise have occurred had monetary policy been conducted as in normal times. Finally, we show that the rule estimated during the Great Recession delivers an economic outcome closer to the flexible-price one than the rule describing the Federal Reserve's conduct in normal times.
    Keywords: Uncertainty shock, nonlinear IVAR, nonlinear DSGE framework, minimum-distance estimation, great recession
    JEL: C22 E32 E52
    Date: 2021–03
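The object the VAR exercise traces out is an impulse response. The paper's VAR is nonlinear (an interacted VAR), so the following linear sketch only illustrates the basic object, with an invented coefficient matrix in which uncertainty depresses real activity:

```python
import numpy as np

# Linear VAR(1) impulse response sketch: y_t = A y_{t-1} + e_t with
# y = (uncertainty, real activity).  The response to a unit uncertainty
# shock at horizon h is A^h applied to the shock vector.
A = np.array([[0.8,  0.0],
              [-0.3, 0.7]])   # illustrative values, not estimates
shock = np.array([1.0, 0.0])  # one-unit uncertainty shock
irf = np.array([np.linalg.matrix_power(A, h) @ shock for h in range(20)])
# irf[:, 1] is the response of real activity: negative, then dying out
```

In the paper's nonlinear setting the responses depend on the state of the economy, which is what lets a shock of the same size do more damage during a recession than in normal times.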

This nep-ets issue is ©2021 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.