New Economics Papers on Econometric Time Series
Issue of 2025–08–11
Seventeen papers chosen by Simon Sosvilla-Rivero, Instituto Complutense de Análisis Económico
By: | Bo Hu; Joon Y. Park; Junhui Qian |
Abstract: | This paper introduces a novel approach to investigate the dynamics of state distributions, which accommodate both cross-sectional distributions of repeated panels and intra-period distributions of a time series observed at high frequency. In our approach, densities of the state distributions are regarded as functional elements in a Hilbert space, and are assumed to follow a functional autoregressive model. We propose an estimator for the autoregressive operator, establish its consistency, and provide tools and asymptotics to analyze the forecast of state density and the moment dynamics of state distributions. We apply our methodology to study the time series of distributions of intra-month GBP/USD exchange rate returns and the time series of cross-sectional distributions of monthly NYSE stock returns. Finally, we conduct simulations to evaluate the density forecasts based on our model. |
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.15763 |
By: | John Michael Riveros-Gavilanes |
Abstract: | This document provides a practical introduction to the standard methodology for estimating Vector Autoregression (VAR) models and their Vector Error Correction (VEC) representation in the context of cointegration. It covers basic concepts such as stationarity and unit roots, unit root testing, cointegration analysis, and the general estimation framework using Stata. The text does not delve into the mathematical formalization of the models but rather aims to serve as an applied estimation guide for undergraduate students. |
Keywords: | Vector autoregression; cointegration; stationarity; time series |
JEL: | C10 C13 C32 |
Date: | 2025–03–09 |
URL: | https://d.repec.org/n?u=RePEc:pra:mprapa:124015 |
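The VAR estimation step that the guide walks through in Stata can also be sketched in a few lines elsewhere; the following numpy-only VAR(1) estimated equation by equation with OLS is a minimal illustration (the model, coefficients, and dimensions are my own, not the paper's):

```python
import numpy as np

def fit_var1(y):
    """Estimate a VAR(1) model y_t = c + A y_{t-1} + e_t by equation-wise OLS.

    y : (T, k) array of observations. Returns (c, A)."""
    ylag = y[:-1]                                     # regressors: y_{t-1}
    ycur = y[1:]                                      # targets: y_t
    X = np.column_stack([np.ones(len(ylag)), ylag])   # prepend intercept
    beta, *_ = np.linalg.lstsq(X, ycur, rcond=None)   # (k+1, k) coefficients
    c, A = beta[0], beta[1:].T   # A[i, j]: effect of y_{j,t-1} on y_{i,t}
    return c, A

# Simulate a bivariate VAR(1) and recover its coefficient matrix.
rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.1], [0.0, 0.4]])
y = np.zeros((3000, 2))
for t in range(1, 3000):
    y[t] = A_true @ y[t - 1] + 0.5 * rng.standard_normal(2)
c_hat, A_hat = fit_var1(y)
```

With 3,000 observations the estimated `A_hat` sits close to `A_true`; in practice one would first test for unit roots and, if series are cointegrated, move to the VEC form rather than a VAR in levels.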
By: | Roberto Esposti (Department of Economics and Social Sciences, Universita' Politecnica delle Marche (UNIVPM)) |
Abstract: | This paper investigates the interdependence among prices in the commodity and natural resource market segment. The analysis is performed using a large dataset comprising about 50 commodity prices observed at monthly frequency over a period of almost half a century (1980-2024). These commodities are clustered in five groups (energy, metals, agriculture, food, other raw materials) in order to discriminate the interdependence within and between groups. The adopted method consists of building a Commodity Price Network (CPN) defined via Granger causality tests. These tests are performed with two alternative empirical strategies: pairwise VAR model estimation (pairwise Granger causality) and sparse VAR model estimation (sparse VAR Granger causality). Both price levels and price first differences are considered in order to take the possible non-stationarity of the price series into account. Network analysis is performed on the different networks obtained from these alternative series and modelling approaches. Results suggest relevant differences across series and methods, but some solid results also emerge, particularly pointing to a generalized interdependence that still assigns a central role to some metals and agricultural products. |
Keywords: | Commodity Prices, Price Interdependence, Granger Causality, Network Analysis, Sparse VAR Models. |
JEL: | C32 Q02 O13 |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:anc:wpaper:498 |
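The pairwise Granger causality tests used to build such a network boil down to comparing a restricted regression (own lags only) with an unrestricted one (own lags plus lags of the candidate cause). The numpy-only sketch below is a simplified illustration; the series, lag length, and data-generating process are my own, not the paper's:

```python
import numpy as np

def granger_f(y, x, p=2):
    """F-statistic for 'x Granger-causes y' with p lags, via restricted vs
    unrestricted OLS (a minimal pairwise version; no small-sample refinements)."""
    T = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:T - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:T - k] for k in range(1, p + 1)])
    ones = np.ones((T - p, 1))
    Xr = np.hstack([ones, lags_y])             # restricted: own lags only
    Xu = np.hstack([ones, lags_y, lags_x])     # unrestricted: adds lags of x
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    return ((rss_r - rss_u) / p) / (rss_u / (T - p - Xu.shape[1]))

# Two series where x drives y but not vice versa.
rng = np.random.default_rng(1)
x = rng.standard_normal(800)
y = np.zeros(800)
for t in range(1, 800):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()
f_xy = granger_f(y, x)   # should be large: x helps predict y
f_yx = granger_f(x, y)   # should be near 1: y adds nothing for x
```

Repeating this over all ordered pairs of commodity prices and keeping the pairs whose statistics exceed a critical value yields a directed network of the kind the paper analyzes.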
By: | Gabor Petnehazi; Laith Al Shaggah; Jozsef Gall; Bernadett Aradi |
Abstract: | This study explores the potential of zero-shot time series forecasting, an innovative approach leveraging pre-trained foundation models, to forecast mortality rates without task-specific fine-tuning. We evaluate two state-of-the-art foundation models, TimesFM and CHRONOS, alongside traditional and machine learning-based methods across three forecasting horizons (5, 10, and 20 years) using data from 50 countries and 111 age groups. In our investigations, zero-shot models showed varying results: while CHRONOS delivered competitive shorter-term forecasts, outperforming traditional methods like ARIMA and the Lee-Carter model, TimesFM consistently underperformed. Fine-tuning CHRONOS on mortality data significantly improved long-term accuracy. A Random Forest model, trained on mortality data, achieved the best overall performance. These findings underscore the potential of zero-shot forecasting while highlighting the need for careful model selection and domain-specific adaptation. |
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.13521 |
By: | Gabriel Rodriguez (Departamento de Economía de la Pontificia Universidad Católica del Perú); Mauricio Alvarado (Departamento de Economía de la Pontificia Universidad Católica del Perú) |
Abstract: | This paper examines the evolution of the inflation uncertainty-inflation relationship in seven Latin American countries and the G7 from Q1 1948 to Q4 2023, using the time-varying parameter stochastic volatility in mean (TVP-SVM) model of Chan (2017) and its extension incorporating time-varying mixture innovations (TVP-SVM-TVMI) from Hou (2020). The key findings are as follows: (i) the TVP-SVM model is preferred in 8 out of 14 countries; (ii) inflation uncertainty has been higher in Latin America than in the G7, particularly during the 1980s "lost decade"; (iii) log-inflation uncertainty is more persistent in Latin America; (iv) there is no evidence supporting the hypothesis of Friedman (1977) in any of the countries analyzed; (v) the Cukierman-Meltzer hypothesis (1986) holds, as the uncertainty-inflation relationship is positive and time-varying in all countries; (vi) this relationship is stronger and statistically significant during periods of high inflation uncertainty; and (vii) there is evidence of more structural breaks in this relationship in Latin America than in the G7. JEL Classification: C11, C15, C58, E31, N16. |
Keywords: | Inflation Uncertainty, Inflation, Latin America, G7, Bayesian Estimation and Comparison, Stochastic Volatility in Mean, Time-Varying Parameters, Structural Breaks. |
Date: | 2025 |
URL: | https://d.repec.org/n?u=RePEc:pcp:pucwps:wp00544 |
By: | Daniele Massacci; Lucio Sarno; Lorenzo Trapani; Pierluigi Vallarino |
Abstract: | We propose a methodology to construct tests for the null hypothesis that the pricing errors of a panel of asset returns are jointly equal to zero in a linear factor asset pricing model -- that is, the null of "zero alpha". We consider, as a leading example, a model with observable, tradable factors, but we also develop extensions to accommodate non-tradable and latent factors. The test is based on equation-by-equation estimation, using a randomized version of the estimated alphas, which only requires rates of convergence. The distinct features of the proposed methodology are that it does not require the estimation of any covariance matrix, and that it allows for both N and T to pass to infinity, with the former possibly faster than the latter. Further, unlike extant approaches, the procedure can accommodate conditional heteroskedasticity, non-Gaussianity, and even strong cross-sectional dependence in the error terms. We also propose a de-randomized decision rule to decide in favor of or against the correct specification of a linear factor pricing model. Monte Carlo simulations show that the test has satisfactory properties and compares favorably to several existing tests. The usefulness of the testing procedure is illustrated through an application of linear factor pricing models to price the constituents of the S&P 500. |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2507.17599 |
By: | Anna Bykhovskaya; Vadim Gorin; Sasha Sodin |
Abstract: | The paper analyzes four classical signal-plus-noise models: the factor model, spiked sample covariance matrices, the sum of a Wigner matrix and a low-rank perturbation, and canonical correlation analysis with low-rank dependencies. The objective is to construct confidence intervals for the signal strength that are uniformly valid across all regimes - strong, weak, and critical signals. We demonstrate that traditional Gaussian approximations fail in the critical regime. Instead, we introduce a universal transitional distribution that enables valid inference across the entire spectrum of signal strengths. The approach is illustrated through applications in macroeconomics and finance. |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2507.18554 |
By: | Sourojyoti Barick; Sudip Ratan Chandra |
Abstract: | This paper explores a comprehensive class of time-changed stochastic processes constructed by subordinating Brownian motion with Lévy processes, where the subordination is further governed by stochastic arrival mechanisms such as the Cox-Ingersoll-Ross (CIR) and Chan-Karolyi-Longstaff-Sanders (CKLS) processes. These models extend classical jump frameworks like the Variance Gamma (VG) and CGMY processes, allowing for more flexible modeling of market features such as jump clustering, heavy tails, and volatility persistence. We first revisit the theory of Lévy subordinators and establish strong consistency results for the VG process under Gamma subordination. Building on this, we prove asymptotic normality for both the VG and VGSA (VG with stochastic arrival) processes when the arrival process follows CIR or CKLS dynamics. The analysis is then extended to the more general CGMY process under stochastic arrival, for which we derive analogous consistency and limit theorems under positivity and regularity conditions on the arrival process. A simulation study accompanies the theoretical work, confirming our results through Monte Carlo experiments, with visualizations and normality testing (via Shapiro-Wilk statistics) that show approximate Gaussian behavior even for processes driven by heavy-tailed jumps. This work provides a rigorous and unified probabilistic framework for analyzing subordinated models with stochastic time changes, with applications to financial modeling and inference under uncertainty. |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2507.17431 |
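The Gamma subordination underlying the VG process can be illustrated directly: simulate drifted Brownian motion evaluated at Gamma-distributed random times and compare the sample moments with what subordination implies. The parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

# Variance Gamma increments as Brownian motion with drift theta and
# volatility sigma, time-changed by a Gamma subordinator whose increments
# have mean dt and variance nu*dt.
rng = np.random.default_rng(2)
theta, sigma, nu, dt, n = 0.1, 0.2, 0.3, 1.0, 200_000

# Gamma subordinator increments: shape dt/nu and scale nu give mean dt,
# variance nu*dt.
g = rng.gamma(shape=dt / nu, scale=nu, size=n)
# VG increment: Brownian motion with drift evaluated at the random Gamma time.
dX = theta * g + sigma * np.sqrt(g) * rng.standard_normal(n)

# Moments implied by conditioning on g:
#   E[dX]   = theta*dt
#   Var[dX] = sigma^2*dt + theta^2*nu*dt
mean_theory = theta * dt
var_theory = sigma**2 * dt + theta**2 * nu * dt
```

The extra term `theta^2*nu*dt` in the variance, and the excess kurtosis the Gamma clock induces, are what give the VG process its fat tails relative to plain Brownian motion; the VGSA models in the paper go further by making the clock's arrival rate itself stochastic (CIR or CKLS).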
By: | Francesco Giancaterini; Alain Hecq; Joann Jasiak; Aryan Manafi Neyazi |
Abstract: | This paper introduces a new approach to detect bubbles based on mixed causal and noncausal processes and their tail process representation during explosive episodes. Departing from traditional definitions of bubbles as nonstationary and temporarily explosive processes, we adopt a perspective in which prices are viewed as following a strictly stationary process, with the bubble considered an intrinsic component of its non-linear dynamics. We illustrate our approach on the phenomenon referred to as the "green bubble" in the field of renewable energy investment. |
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.14911 |
By: | MINAMI, Koutaroh |
Abstract: | This study explores the potential of machine learning, specifically Long Short-Term Memory (LSTM), to detect asset price bubbles by analyzing prediction errors. Using monthly data on the Nikkei225 Index, I evaluate the performance of the LSTM model in forecasting prices and compare it with the GSADF test. I find that the LSTM's prediction accuracy significantly deteriorates during periods associated with asset bubbles, suggesting the presence of structural changes. In particular, the LSTM approach of this paper captures both the emergence and the collapse of Japan's late-1980s bubble separately. In addition, it can also capture structural changes related to policy changes in 2010s Japan, which are not identified by the GSADF test. These findings suggest that machine learning can be used not only for identifying bubbles but also for policy evaluation. |
Keywords: | Bubbles, Generalized Supremum Augmented Dickey-Fuller test (GSADF), Machine learning, Long Short-Term Memory (LSTM) |
JEL: | G10 G17 |
Date: | 2025–06 |
URL: | https://d.repec.org/n?u=RePEc:hit:hcfrwp:g-1-30 |
By: | Papp, Tamás K. (Institute for Advanced Studies, Vienna) |
Abstract: | We propose a flexible, extensible family of distributions for testing Markov Chain Monte Carlo implementations. Distributions are created by nesting simple transformations, which allow various shapes, including multiple modes and fat tails. The resulting distributions can be sampled with high precision using quasi-random sequences, and have closed-form (log) density and gradient at each point, making it possible to test gradient-based samplers without automatic differentiation. |
Date: | 2025–08 |
URL: | https://d.repec.org/n?u=RePEc:ihs:ihswps:number60 |
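The nesting-transformations idea can be sketched with a single transform: push a standard normal through an increasing map T and read the log-density off the change-of-variables formula. The specific transform and constant below are my own illustration, not the paper's family:

```python
import numpy as np

# Test distribution Y = T(X) with X ~ N(0,1) and T(x) = sinh(c*x)/c, which
# fattens the tails while keeping an exact sampler and a closed-form
# log-density via log f_Y(y) = log phi(T^{-1}(y)) + log |dT^{-1}/dy|.
c = 0.8

def logpdf(y):
    u = np.arcsinh(c * y) / c                       # T^{-1}(y)
    log_phi = -0.5 * u**2 - 0.5 * np.log(2 * np.pi) # standard normal log-density
    log_jac = -0.5 * np.log1p((c * y) ** 2)         # d/dy arcsinh(c*y)/c = 1/sqrt(1+(c*y)^2)
    return log_phi + log_jac

def sample(n, rng):
    """Exact sampling: push normal (or quasi-random normal) draws through T."""
    return np.sinh(c * rng.standard_normal(n)) / c
```

Because T is applied to exact draws, the sampler and the density agree by construction, which is what makes such families useful as ground truth for MCMC diagnostics; nesting further transforms (shifts, mixtures) composes both the sampler and the Jacobian terms.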
By: | Matthias R. Fengler (University of St. Gallen - SEPS: Economics and Political Sciences; Swiss Finance Institute); Bruno Jäger (Eastern Switzerland University of Applied Sciences); Ostap Okhrin (Dresden University of Technology) |
Abstract: | We study local change-point detection in variance using generalized likelihood ratio tests. Building on Suvorikova & Spokoiny (2017), we utilize the multiplier bootstrap to approximate the unknown, non-asymptotic distribution of the test statistic and introduce a multiplicative bias correction that improves upon the existing additive version. This proposed correction offers a clearer interpretation of the bootstrap estimators while significantly reducing computational costs. Simulation results demonstrate that our method performs comparably to the original approach. We apply it to the growth rates of U.S. inflation, industrial production, and Bitcoin returns. |
Keywords: | generalized likelihood ratio test, multiplier bootstrap, local change point detection, economic and financial variance |
Date: | 2025–06 |
URL: | https://d.repec.org/n?u=RePEc:chf:rpseri:rp2560 |
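A generic multiplier-bootstrap calibration for a CUSUM-type variance change-point statistic can be sketched as follows; this is a simplified stand-in for the paper's generalized likelihood ratio construction and its bias corrections, not a reimplementation of them:

```python
import numpy as np

rng = np.random.default_rng(3)

def cusum_stat(z):
    """max_k |sum_{t<=k} (z_t - mean(z))| / (sd(z) * sqrt(T)) on squared data z."""
    T = len(z)
    d = z - z.mean()
    return np.abs(np.cumsum(d)).max() / (z.std() * np.sqrt(T))

def multiplier_bootstrap(z, B=500, rng=rng):
    """Recompute the statistic with i.i.d. Gaussian multipliers on the
    centered terms, approximating its null distribution without refitting."""
    T = len(z)
    d = z - z.mean()
    stats = np.empty(B)
    for b in range(B):
        w = rng.standard_normal(T)
        stats[b] = np.abs(np.cumsum(w * d)).max() / (z.std() * np.sqrt(T))
    return stats

# Series with a variance break at t = 300: sigma jumps from 1 to 2.
x = np.concatenate([rng.standard_normal(300), 2.0 * rng.standard_normal(300)])
z = (x - x.mean()) ** 2                       # squared deviations carry the variance signal
stat = cusum_stat(z)
crit = np.quantile(multiplier_bootstrap(z), 0.95)
```

With this strong break the observed statistic clears the bootstrap critical value comfortably; the appeal of the multiplier scheme, as in the paper, is that the B bootstrap replications reuse the fitted residual terms and so cost far less than refitting the model B times.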
By: | Roberto Casarin (Ca' Foscari University of Venice); Antonio Peruzzi (Ca' Foscari University of Venice); Davide Raggi (Ca' Foscari University of Venice) |
Abstract: | We study a New Keynesian Phillips curve in which agents deviate from the rational expectation paradigm and forecast inflation using a simple, potentially misspecified autoregressive rule. Consistency criteria à la Hommes and Zhu (2014) between perceived and actual laws of motion of inflation might allow for multiple expectational equilibria. Unfortunately, multiple equilibria models pose challenges for empirical validation. This paper proposes a latent Markov chain process to dynamically separate such equilibria. Moreover, an original Bayesian inference approach based on hierarchical priors is introduced, which naturally offers the possibility of incorporating equilibrium-identifying constraints with various degrees of prior beliefs. Finally, an inference procedure is proposed to assess a posteriori the probability that the theoretical constraints are satisfied and to estimate the equilibrium changes over time. We show that common prior assumptions regarding structural parameters favor the separation of equilibria, thereby making the Bayesian inference a natural framework for Markov–switching Phillips curve models. Empirical evidence obtained from observed inflation, output gap, and the consensus expectations from the Survey of Professional Forecasters supports multiple equilibria, and we find evidence of temporal variation in over- and under-reaction patterns, which, to the best of our knowledge, have not been previously documented. Specifically, we observe that agents tend to underreact to shocks when inflation is high and persistent, whereas they behave substantially as fully informed forecasters when the inflation level is low and stable, i.e., after the mid–nineties. We also find that the model does not suffer from the missing disinflation puzzle during the Great Recession. |
Keywords: | Bounded rationality; Markov Switching; Multiple equilibria; Under-reaction; Bayesian methods; Horseshoe hierarchical priors; Survey of Professional Forecasters |
JEL: | C11 C24 E31 D84 |
Date: | 2025 |
URL: | https://d.repec.org/n?u=RePEc:ven:wpaper:2025:10 |
By: | Alessandro Casini (DEF, University of Rome "Tor Vergata"); Adam McCloskey (University of Colorado at Boulder) |
Abstract: | We consider identification, estimation and inference in high-frequency event study regressions, which have been used widely in the recent macroeconomics, financial economics and political economy literatures. The high-frequency event study method regresses changes in an outcome variable on a measure of unexpected changes in a policy variable in a narrow time window around an event or a policy announcement (e.g., a 30-minute window around an FOMC announcement). We show that, contrary to popular belief, the narrow size of the window is not sufficient for identification. Rather, the population regression coefficient identifies a causal estimand when (i) the effect of the policy shock on the outcome does not depend on the other variables (separability) and (ii) the surprise component of the news or event dominates all other variables that are present in the event window (relative exogeneity). Technically, the latter condition requires the ratio between the variance of the policy shock and that of the other variables to be infinite in the event window. Under these conditions, we establish the causal meaning of the event study estimand corresponding to the regression coefficient and super-consistency of the event study estimator with rate of convergence faster than the parametric rate. We show the asymptotic normality of the estimator and propose bias-corrected inference. We also provide bounds on the worst-case bias and use them to quantify its impact on the worst-case coverage properties of confidence intervals, as well as to construct a bias-aware critical value. Notably, this standard linear regression estimator is robust to general forms of nonlinearity. We apply our results to Nakamura and Steinsson’s (2018a) analysis of the real economic effects of monetary policy, providing a simple empirical procedure to analyze the extent to which the standard event study estimator adequately estimates causal effects of interest. |
Keywords: | Causal effects, Event study, High-frequency data, Identification |
JEL: | C32 C51 |
Date: | 2025–07–28 |
URL: | https://d.repec.org/n?u=RePEc:rtv:ceisrp:608 |
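The relative-exogeneity condition can be illustrated with a stylized simulation: when the surprise's variance dominates the other shocks inside the window, the simple window regression recovers the causal slope even though those other shocks are omitted. All numbers below are illustrative:

```python
import numpy as np

# Outcome change dy = beta*surprise + gamma*other + noise. The event-study
# regression omits `other`; it still recovers beta when Var(surprise) is
# large relative to Var(other), the paper's relative-exogeneity condition.
rng = np.random.default_rng(4)
n, beta, gamma = 5000, 1.5, 0.7

surprise = 10.0 * rng.standard_normal(n)   # dominant within-window policy shock
other = rng.standard_normal(n)             # background shocks, small variance
dy = beta * surprise + gamma * other + 0.1 * rng.standard_normal(n)

beta_hat = (surprise @ dy) / (surprise @ surprise)   # no-intercept OLS slope
```

Shrinking the factor 10.0 toward 1.0 lets the omitted-variable term contaminate `beta_hat`, which is the sense in which a narrow window alone, without variance dominance, does not identify the causal effect.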
By: | Chen, Fangyi; Chen, Yunxiao; Ying, Zhiliang; Zhou, Kangjie |
Abstract: | Recurrent event time data arise in many studies, including in biomedicine, public health, marketing and social media analysis. High-dimensional recurrent event data involving many event types and observations have become prevalent with advances in information technology. This article proposes a semiparametric dynamic factor model for the dimension reduction of high-dimensional recurrent event data. The proposed model imposes a low-dimensional structure on the mean intensity functions of the event types while allowing for dependencies. A nearly rate-optimal smoothing-based estimator is proposed. An information criterion that consistently selects the number of factors is also developed. Simulation studies demonstrate the effectiveness of these inference tools. The proposed method is applied to grocery shopping data, for which an interpretable factor structure is obtained. |
Keywords: | counting process; factor analysis; marginal modelling; kernel smoothing; information criterion |
JEL: | C1 |
Date: | 2025–07–21 |
URL: | https://d.repec.org/n?u=RePEc:ehl:lserod:127778 |
By: | Pascal Michaillat |
Abstract: | This paper develops a new algorithm for detecting US recessions in real time. The algorithm constructs millions of recession classifiers by combining unemployment and vacancy data to reduce detection noise. Classifiers are then selected to avoid both false negatives (missed recessions) and false positives (nonexistent recessions). The selected classifiers are therefore perfect, in that they identify all 15 historical recessions in the 1929–2021 training period without any false positives. By further selecting classifiers that lie on the high-precision segment of the anticipation-precision frontier, the algorithm optimizes early detection without sacrificing precision. On average, over 1929–2021, the classifier ensemble signals recessions 2.2 months after their true onset, with a standard deviation of detection errors of 1.9 months. Applied to May 2025 data, the classifier ensemble gives a 71% probability that the US economy is currently in recession. A placebo test and backtests confirm the algorithm’s reliability. The classifier ensembles trained on 1929–2004, 1929–1984, and 1929–1964 data in backtests give a current recession probability of 58%, 83%, and 25%, respectively. |
JEL: | C52 E24 E32 J63 N12 |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:nbr:nberwo:34015 |
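A toy version of one such unemployment-and-vacancy classifier might look as follows; the thresholds and the construction are my own illustration, not the paper's calibrated ensemble:

```python
import numpy as np

def recession_signal(u, v, du=0.3, dv=0.3, window=12):
    """Flag a recession when the 3-month average unemployment rate has risen
    more than `du` above its prior 12-month low AND the 3-month average
    vacancy rate has fallen more than `dv` below its prior 12-month high."""
    u3 = np.convolve(u, np.ones(3) / 3, mode="valid")   # 3-month averages
    v3 = np.convolve(v, np.ones(3) / 3, mode="valid")
    flags = []
    for t in range(window, len(u3)):
        rise = u3[t] - u3[t - window:t].min()
        drop = v3[t - window:t].max() - v3[t]
        flags.append(rise > du and drop > dv)
    return np.array(flags)

# Synthetic series: flat labor market, then a downturn after month 30.
u = np.concatenate([np.full(30, 4.0), np.linspace(4.0, 6.0, 20)])
v = np.concatenate([np.full(30, 5.0), np.linspace(5.0, 3.5, 20)])
flags = recession_signal(u, v)
```

Varying `du`, `dv`, the smoothing span, and the lookback window generates the large family of candidate classifiers from which the paper's algorithm keeps only those with no historical false positives or false negatives.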
By: | Radoslaw Trojanek; Luke Hartigan; Norbert Pfeifer; Miriam Steurer |
Abstract: | Timely transaction-based residential property price indices are crucial for effective monetary and macroprudential policy, yet transaction-based data often suffer from significant reporting delays. Online property platforms, by contrast, provide list prices of properties in real time. This paper examines whether immediately available online list prices can improve timely nowcasts of transaction price movements. Using 16 years of micro-level data from Warsaw and Poznan, we construct quality-adjusted monthly list-price and quarterly transaction-price indices using the hedonic rolling-time-dummy method. We find that list-price indices consistently lead transaction-price indices by one to two months, with the strongest relationship in Warsaw's larger, more liquid market. Building on this lead-lag relationship, we develop a Mixed Data Sampling (MIDAS) regression framework to nowcast quarterly transaction-price growth using monthly list-price data. Our preferred MIDAS specifications reduce one-quarter-ahead root mean square error by approximately 16-23 percent for Warsaw and 5-15 percent for Poznan relative to standard autoregressive benchmarks. The predictive advantage is greatest when incorporating list-price data from the first or second month of the quarter, as third-month data introduce forward-looking noise. Our results show that properly constructed list-price indices can play an important role in providing early housing market signals, potentially enhancing the timeliness of policy responses. |
Keywords: | MIDAS regression, nowcasting, house price index, hedonic price index, macro-prudential supervision, online price data, rolling time dummy |
JEL: | C43 E01 E31 R31 |
Date: | 2025–08 |
URL: | https://d.repec.org/n?u=RePEc:een:camaaa:2025-45 |
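The simplest member of the MIDAS family, unrestricted MIDAS (U-MIDAS), can be sketched as an OLS regression of the quarterly series on the three monthly observations of each quarter, letting the data pick the intra-quarter weights. The paper's preferred specifications are richer; the data below are synthetic:

```python
import numpy as np

# U-MIDAS: quarterly transaction-price growth y regressed on the three
# within-quarter monthly list-price growth readings, one coefficient each.
rng = np.random.default_rng(5)
Q = 400
m = rng.standard_normal((Q, 3))                 # monthly indicator, 3 obs/quarter
w_true = np.array([0.5, 0.3, 0.2])              # true intra-quarter weights
y = m @ w_true + 0.1 * rng.standard_normal(Q)   # quarterly target series

X = np.column_stack([np.ones(Q), m])            # intercept + monthly regressors
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
w_hat = coef[1:]                                # recovered intra-quarter weights
```

For nowcasting mid-quarter, one simply drops the columns for months not yet observed and refits, which mirrors the paper's finding that first- and second-month list-price data carry most of the predictive value.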