New Economics Papers on Econometric Time Series
Issue of 2025–07–28
Twenty papers chosen by Simon Sosvilla-Rivero, Instituto Complutense de Análisis Económico
By: | Guglielmo Maria Caporale; Luis Alberiko Gil-Alana |
Abstract: | This note puts forward a new modelling approach that includes both fractional integration and autoregressive processes in a unified framework. The proposed model is very general and nests more standard approaches such as the AR(F)IMA models. Some Monte Carlo evidence shows that the suggested framework outperforms standard AR(F)IMA specifications in capturing the properties of the series examined.
Keywords: | time series modelling, stationarity, fractional integration, autoregressions |
JEL: | C22 C50 |
Date: | 2025 |
URL: | https://d.repec.org/n?u=RePEc:ces:ceswps:_11984 |
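The fractional-integration and autoregressive ingredients that this framework unifies can be illustrated in a few lines. The sketch below simulates an AR(1) driven by I(d) noise via the binomial expansion of (1 - L)^d; it is a textbook construction under assumed parameter names, not the authors' model.

```python
import numpy as np

def frac_diff_weights(d, n):
    """Binomial weights of (1 - L)^d: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def simulate_ar_fi(d, phi, n, rng):
    """AR(1) driven by fractionally integrated noise: x_t = phi * x_{t-1} + u_t, u ~ I(d)."""
    eps = rng.standard_normal(n)
    w = frac_diff_weights(-d, n)          # (1 - L)^{-d} turns white noise into I(d) noise
    u = np.array([w[: t + 1][::-1] @ eps[: t + 1] for t in range(n)])
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + u[t]
    return x

rng = np.random.default_rng(0)
x = simulate_ar_fi(d=0.3, phi=0.5, n=500, rng=rng)
```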
By: | Mirko Armillotta (University of Rome Tor Vergata); Paolo Gorgi (Vrije Universiteit Amsterdam and Tinbergen Institute); André Lucas (Vrije Universiteit Amsterdam and Tinbergen Institute) |
Abstract: | This paper presents a novel copula-based autoregressive framework for multilayer arrays of integer-valued time series with tensor structure. It complements recent advances in tensor time series that predominantly focus on real-valued data and overlook the unique properties of integer-valued time series, such as discreteness and non-negativity. Our approach incorporates feedback effects for the time-varying parameters that describe the counts’ temporal dynamics and introduces new identification constraints for parameter estimation. We provide an asymptotic theory for a Two-Stage Maximum Likelihood Estimator (2SMLE) tailored to the new tensor model. The estimator tackles the model’s multidimensionality and interdependence challenges for large-scale count datasets, while at the same time addressing computational challenges inherent to copula parameter estimation. In this way it significantly advances the modeling of count tensors. An application to crime time series demonstrates the practical utility of the proposed methodology. |
Keywords: | INGARCH, tensor autoregression, parameter identification, quasi-likelihood, two-stage estimator |
JEL: | C32 C55 |
Date: | 2025–02–05 |
URL: | https://d.repec.org/n?u=RePEc:tin:wpaper:20250004 |
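As a point of reference for the count dynamics in this entry, here is a minimal univariate Poisson INGARCH(1, 1) simulation. The paper's copula-based tensor model and two-stage estimator are far more general; all names below are hypothetical.

```python
import numpy as np

def simulate_ingarch(omega, alpha, beta, n, rng):
    """Poisson INGARCH(1,1): y_t | past ~ Poisson(lambda_t),
    lambda_t = omega + alpha * y_{t-1} + beta * lambda_{t-1}."""
    y = np.zeros(n, dtype=np.int64)
    lam = np.empty(n)
    lam[0] = omega / (1 - alpha - beta)   # start at the stationary mean
    y[0] = rng.poisson(lam[0])
    for t in range(1, n):
        lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
        y[t] = rng.poisson(lam[t])
    return y, lam

rng = np.random.default_rng(1)
y, lam = simulate_ingarch(omega=0.5, alpha=0.3, beta=0.5, n=1000, rng=rng)
print(y.mean(), lam.mean())               # both near omega / (1 - alpha - beta) = 2.5
```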
By: | Cameron Cornell; Lewis Mitchell; Matthew Roughan |
Abstract: | Causal networks offer an intuitive framework to understand influence structures within time series systems. However, the presence of cycles can obscure dynamic relationships and hinder hierarchical analysis. These networks are typically identified through multivariate predictive modelling, but enforcing acyclic constraints significantly increases computational and analytical complexity. Despite recent advances, there remains a lack of simple, flexible approaches that are easily tailorable to specific problem instances. We propose an evolutionary approach to fitting acyclic vector autoregressive processes and introduce a novel hierarchical representation that directly models structural elements within a time series system. On simulated datasets, our model retains most of the predictive accuracy of unconstrained models and outperforms permutation-based alternatives. When applied to a dataset of 100 cryptocurrency return series, our method generates acyclic causal networks capturing key structural properties of the unconstrained model. The acyclic networks are approximately sub-graphs of the unconstrained networks, and most of the removed links originate from low-influence nodes. Given the high levels of feature preservation, we conclude that this cryptocurrency price system functions largely hierarchically. Our findings demonstrate a flexible, intuitive approach for identifying hierarchical causal networks in time series systems, with broad applications to fields like econometrics and social network analysis.
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.12806 |
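The acyclicity constraint at the heart of this approach is easy to state in code. The sketch below, under assumed notation (a VAR(1) matrix A with x_t = A x_{t-1} + e_t), thresholds coefficients into a directed network and checks for cycles with Kahn's topological sort; the evolutionary fitting procedure itself is not reproduced.

```python
import numpy as np

def is_acyclic(adj):
    """Kahn's topological sort on a boolean adjacency matrix (adj[i, j]: edge i -> j)."""
    adj = adj.copy()
    indeg = adj.sum(axis=0)
    queue = [i for i in range(len(indeg)) if indeg[i] == 0]
    seen = 0
    while queue:
        u = queue.pop()
        seen += 1
        for v in np.flatnonzero(adj[u]):
            adj[u, v] = False
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return seen == len(indeg)

# A VAR(1) matrix A induces an edge j -> i whenever |A[i, j]| exceeds a threshold tau.
A = np.array([[0.5, 0.0, 0.0],
              [0.4, 0.3, 0.0],
              [0.0, 0.2, 0.6]])
adj = np.abs(A.T) > 0.05
np.fill_diagonal(adj, False)   # ignore own lags when assessing the hierarchy
print(is_acyclic(adj))         # True: this toy system is hierarchical
```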
By: | Janneke van Brummelen (Vrije Universiteit Amsterdam); Paolo Gorgi (Vrije Universiteit Amsterdam and Tinbergen Institute); Siem Jan Koopman (Vrije Universiteit Amsterdam and Tinbergen Institute) |
Abstract: | We develop a score-driven time-varying parameter model where no particular parametric error distribution needs to be specified. The proposed method relies on a versatile spline-based density, which produces a score function that follows a natural cubic spline. This flexible approach nests the Gaussian density as a special case. It can also represent asymmetric and leptokurtic densities that produce outlier-robust updating functions for the time-varying parameter and are often appealing in empirical applications. As leading examples, we consider models where the time-varying parameters appear in the location or in the log-scale of the observations. The static parameter vector of the model can be estimated by means of maximum likelihood and we formally establish some of the asymptotic properties of such estimators. We illustrate the practical relevance of the proposed method in two empirical studies. We employ the location model to filter the mean of the U.S. monthly CPI inflation series and the scale model for volatility filtering of the full panel of daily stock returns from the S&P 500 index. The results show a competitive performance of the method compared to a set of competing models that are available in the existing literature. |
JEL: | C13 C22 |
Date: | 2025–02–16 |
URL: | https://d.repec.org/n?u=RePEc:tin:wpaper:20250011 |
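The score-driven updating scheme the paper builds on reduces, in its Gaussian special case, to a very short recursion. A hypothetical sketch follows, with a Student-t score added to show the kind of outlier-robust update the flexible density can produce; the spline-based density itself is the paper's contribution and is not implemented here.

```python
import numpy as np

def score_driven_location(y, omega, a, b, score):
    """Generic score-driven location filter: mu_{t+1} = omega + b * mu_t + a * score(y_t - mu_t)."""
    mu = np.empty(len(y))
    mu[0] = y[0]
    for t in range(len(y) - 1):
        mu[t + 1] = omega + b * mu[t] + a * score(y[t] - mu[t])
    return mu

gaussian_score = lambda e: e                                   # Gaussian case: score is the residual
t_score = lambda e, nu=5.0: (1 + 1 / nu) * e / (1 + e**2 / nu)  # Student-t: bounded, outlier-robust

rng = np.random.default_rng(2)
y = np.cumsum(0.1 * rng.standard_normal(300)) + rng.standard_normal(300)
mu_g = score_driven_location(y, omega=0.0, a=0.3, b=1.0, score=gaussian_score)
mu_t = score_driven_location(y, omega=0.0, a=0.3, b=1.0, score=t_score)
```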
By: | Pál, Tibor; Storti, Giuseppe |
Abstract: | This paper analyses the dynamics of the natural rate of interest (r-star) in the US using a score-driven state-space model within the Laubach–Williams structural framework. Compared to standard score-driven specifications, the proposed model enhances flexibility in variance adjustment by assigning time-varying weights to both the conditional likelihood score and the inertia coefficient in the volatility updating equations. The improved state dependence of volatility dynamics effectively accounts for sudden shifts in volatility persistence induced by highly volatile unexpected events. In addition, allowing time variation in the IS and Phillips curve relationships enables the analysis of structural changes in the US economy that are relevant to monetary policy. The results indicate that the advanced models improve the precision of r-star estimates by responding more effectively to changes in macroeconomic conditions. |
Keywords: | r-star, state-space, Kalman filter, score-driven models |
JEL: | C13 C51 E52 |
Date: | 2025–07–14 |
URL: | https://d.repec.org/n?u=RePEc:pra:mprapa:125338 |
By: | Tony Paul, Nitin |
Abstract: | Traditional GARCH models, while robust, are deterministic and their long-horizon forecasts converge to a static mean, failing to capture the dynamic nature of real markets. By contrast, classical stochastic volatility models often introduce significant implementation and calibration complexity. This paper introduces GARCH-FX (GARCH Forecasting eXtension), a novel and accessible framework that augments the classic GARCH model to generate realistic, stochastic volatility paths without this prohibitive complexity. GARCH-FX is built upon the core strength of GARCH—its ability to estimate long-run variance—but replaces the deterministic multi-step forecast with a stochastic simulation engine. It injects controlled randomness through a Gamma-distributed process, ensuring the forecast path is non-smooth and jagged. Furthermore, it incorporates a modular regime-switching multiplier, providing a flexible interface to inject external views or systematic signals into the forecast’s mean level. The result is a powerful and intuitive framework for generating dynamic long-term volatility scenarios. By separating the drivers of mean-level shifts from local stochastic behavior, GARCH-FX aims to provide a practical tool for applications requiring realistic market simulations, such as stress-testing, risk analysis, and synthetic data generation.
Keywords: | Stochastic Volatility Forecasting, GARCH Extensions, Regime-Switching Volatility, Gamma-Distributed Volatility, Volatility Forecast Uncertainty, Nonlinear GARCH Models, Stochastic Vol Forecast, Financial Time Series, Heteroskedasticity Dynamics, Gamma Noise in Volatility
JEL: | C22 C53 C6 |
Date: | 2025–07–10 |
URL: | https://d.repec.org/n?u=RePEc:pra:mprapa:125321 |
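A minimal sketch of the simulation engine the abstract describes, assuming a standard GARCH(1, 1) long-run variance anchor, unit-mean Gamma noise, and an exogenous regime multiplier. Parameter names and choices are hypothetical, not the paper's code.

```python
import numpy as np

def garch_fx_path(omega, alpha, beta, sigma2_0, regimes, shape, rng):
    """Stochastic volatility-path generator in the spirit of GARCH-FX (sketch).

    Each step takes the deterministic GARCH(1,1) decay toward the long-run
    variance, perturbs it with unit-mean Gamma noise, and scales it by an
    exogenous regime multiplier."""
    lr_var = omega / (1 - alpha - beta)      # long-run (unconditional) variance
    persistence = alpha + beta
    path = np.empty(len(regimes))
    var_t = sigma2_0
    for h, m in enumerate(regimes):
        var_t = lr_var + persistence * (var_t - lr_var)  # deterministic GARCH decay
        noise = rng.gamma(shape, 1.0 / shape)            # Gamma(k, 1/k): mean 1, jagged
        path[h] = m * var_t * noise
    return np.sqrt(path)                                 # volatility path

rng = np.random.default_rng(3)
regimes = np.r_[np.ones(125), 1.5 * np.ones(125)]        # hypothetical high-vol regime shift
vol = garch_fx_path(omega=0.05, alpha=0.08, beta=0.90,
                    sigma2_0=0.3, regimes=regimes, shape=20.0, rng=rng)
```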
By: | Anubha Goel; Puneet Pasricha; Martin Magris; Juho Kanniainen |
Abstract: | Time series foundation models (FMs) have emerged as a popular paradigm for zero-shot multi-domain forecasting. These models are trained on numerous diverse datasets and claim to be effective forecasters across multiple different time series domains, including financial data. In this study, we evaluate the effectiveness of FMs, specifically the TimesFM model, for volatility forecasting, a core task in financial risk management. We first evaluate TimesFM in its pretrained (zero-shot) form, followed by our custom fine-tuning procedure based on incremental learning, and compare the resulting models against standard econometric benchmarks. While the pretrained model provides a reasonable baseline, our findings show that incremental fine-tuning, which allows the model to adapt to new financial return data over time, is essential for learning volatility patterns effectively. Fine-tuned variants not only improve forecast accuracy but also statistically outperform traditional models, as demonstrated through Diebold-Mariano and Giacomini-White tests. These results highlight the potential of foundation models as scalable and adaptive tools for financial forecasting, capable of delivering strong performance in dynamic market environments when paired with targeted fine-tuning strategies.
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.11163 |
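The Diebold-Mariano comparison cited above has a compact form. Below is a minimal implementation with squared-error loss and the rectangular long-run variance of the original DM setup; it is a generic sketch, independent of TimesFM.

```python
import numpy as np
from scipy import stats

def diebold_mariano(e1, e2, h=1):
    """DM test on the loss differential d_t = e1_t^2 - e2_t^2 (negative stat favors model 1)."""
    d = e1**2 - e2**2
    n = len(d)
    dbar = d.mean()
    lrv = np.mean((d - dbar) ** 2)              # gamma_0
    for k in range(1, h):                       # rectangular kernel with h - 1 lags
        lrv += 2 * np.mean((d[k:] - dbar) * (d[:-k] - dbar))
    dm = dbar / np.sqrt(lrv / n)
    return dm, 2 * stats.norm.sf(abs(dm))       # statistic and two-sided p-value

rng = np.random.default_rng(4)
e1 = 0.9 * rng.standard_normal(500)             # model 1 slightly more accurate
e2 = rng.standard_normal(500)
print(diebold_mariano(e1, e2, h=1))
```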
By: | Guglielmo Maria Caporale; Luis Alberiko Gil-Alana |
Abstract: | This paper puts forward a general statistical model in the time domain based on the concept of fractional integration. More specifically, in the proposed framework, instead of imposing that the roots lie strictly on the unit circle, we also allow them to lie within it. This approach enables us to specify the time series in terms of its infinite past, with a rate of dependence between the observations much smaller than that produced by the classic I(d) representations. Both Monte Carlo experiments and empirical applications to climatological and financial data show that the proposed approach performs well.
Keywords: | fractional integration, unit roots, testing procedure |
JEL: | C22 |
Date: | 2025 |
URL: | https://d.repec.org/n?u=RePEc:ces:ceswps:_11983 |
By: | Lukas Wiechers (Paderborn University) |
Abstract: | Standard empirical methods for the identification of rational bubbles in asset markets rely solely on examining explosive time series behavior but do not contain any additional information about the fundamental value and the bubble component. However, obtaining an explicit fundamental solution gives a reasonable starting point for estimating these two components simultaneously. In a decomposition approach on monthly S&P 500 stock data from 1871 to 2023, I highlight the importance of market participants’ changing information sets over time, leading to estimation results that fit the underlying data much better than in an ex-post analysis. Bubbles become analyzable not only on grounds of explosive time series behavior, but also in terms of their size. I further derive a bubble cycle that depicts periods with autoregressive patterns relatable to bubbles. Moreover, by adopting a growth rate perspective, real price growth rates become attributable to fundamental and non-fundamental bubble factors.
Keywords: | Bubble Cycle, Dividend-Price Ratio, Exuberance, Rational Bubble, Time-Varying Discount Rate |
JEL: | C14 G12 G14 |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:pdn:ciepap:163 |
By: | Federico Carlini (LUISS Business School); Mirco Rubin (EDHEC Business School); Pierluigi Vallarino (Erasmus University Rotterdam and Tinbergen Institute) |
Abstract: | We propose a new rank-based test for the number of common primitive shocks, q, in large panel data. After estimating a VAR(1) model on r static factors extracted by principal component analysis, we estimate the number of common primitive shocks by testing the rank of the VAR residuals’ covariance matrix. The new test is based on the asymptotic distribution of the sum of the smallest r − q eigenvalues of the residuals’ covariance matrix. We develop both plug-in and bootstrap versions of this eigenvalue-based test. The eigenvectors associated with the q largest eigenvalues allow us to construct an easy-to-implement estimator of the common primitive shocks. We illustrate our testing and estimation procedures with applications to panels of macroeconomic variables and individual stocks’ volatilities.
JEL: | C12 C23 C38 |
Date: | 2025–03–07 |
URL: | https://d.repec.org/n?u=RePEc:tin:wpaper:20250016 |
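The test statistic has a transparent construction: extract r principal-component factors, fit a VAR(1), and sum the smallest r − q eigenvalues of the residual covariance. The sketch below computes that statistic on a toy panel; the plug-in and bootstrap critical values are the paper's contribution and are omitted.

```python
import numpy as np

def smallest_eigs_stat(X, r, q):
    """Sum of the r - q smallest eigenvalues of the VAR(1) residual covariance of r PCA factors."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    F = Xc @ Vt[:r].T                                    # r static factors
    Y, Z = F[1:], F[:-1]
    A = np.linalg.lstsq(Z, Y, rcond=None)[0]             # VAR(1) on the factors
    U = Y - Z @ A
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(U.T)))
    return eigvals[: r - q].sum()

rng = np.random.default_rng(5)
T, N, q_true = 400, 50, 2                                # toy panel: q = 2 primitive shocks
shocks = rng.standard_normal((T, q_true))
X = shocks @ rng.standard_normal((q_true, N)) + 0.1 * rng.standard_normal((T, N))
print(smallest_eigs_stat(X, r=4, q=2))                   # small when q = 2 is correct
```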
By: | Qingyu Li; Chiranjib Mukhopadhyay; Abolfazl Bayat; Ali Habibnia |
Abstract: | Recent advances in quantum computing have demonstrated its potential to significantly enhance the analysis and forecasting of complex classical data. Among these, quantum reservoir computing has emerged as a particularly powerful approach, combining quantum computation with machine learning for modeling nonlinear temporal dependencies in high-dimensional time series. As with many data-driven disciplines, quantitative finance and econometrics can hugely benefit from emerging quantum technologies. In this work, we investigate the application of quantum reservoir computing for realized volatility forecasting. Our model employs a fully connected transverse-field Ising Hamiltonian as the reservoir with distinct input and memory qubits to capture temporal dependencies. The quantum reservoir computing approach is benchmarked against several econometric models and standard machine learning algorithms. The models are evaluated using multiple error metrics and the model confidence set procedures. To enhance interpretability and mitigate current quantum hardware limitations, we utilize wrapper-based forward selection for feature selection, identifying optimal subsets, and quantifying feature importance via Shapley values. Our results indicate that the proposed quantum reservoir approach consistently outperforms benchmark models across various metrics, highlighting its potential for financial forecasting despite existing quantum hardware constraints. This work serves as a proof-of-concept for the applicability of quantum computing in econometrics and financial analysis, paving the way for further research into quantum-enhanced predictive modeling as quantum hardware capabilities continue to advance. |
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.13933 |
By: | Philippe Goulet Coulombe; Karin Klieber |
Abstract: | Local projections (LPs) are widely used in empirical macroeconomics to estimate impulse responses to policy interventions. Yet, in many ways, they are black boxes. It is often unclear what mechanism or historical episodes drive a particular estimate. We introduce a new decomposition of LP estimates into the sum of contributions of historical events, each being the product, for a given time stamp, of a weight and the realization of the response variable. In the least squares case, we show that these weights admit two interpretations. First, they represent purified and standardized shocks. Second, they serve as proximity scores between the projected policy intervention and past interventions in the sample. Notably, this second interpretation extends naturally to machine learning methods, many of which yield impulse responses that, while nonlinear in predictors, still aggregate past outcomes linearly via proximity-based weights. Applying this framework to shocks in monetary and fiscal policy, global temperature, and the excess bond premium, we find that easily identifiable events, such as Nixon's interference with the Fed, stagflation, World War II, and the Mount Agung volcanic eruption, emerge as dominant drivers of often heavily concentrated impulse response estimates.
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.12422 |
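In the least-squares case the decomposition follows directly from OLS algebra: the coefficient on the shock is a weighted sum of outcome realizations, so each period's contribution is a weight times an outcome. A hypothetical sketch:

```python
import numpy as np

def lp_weights(X, focus):
    """Rows of (X'X)^{-1} X' map the outcome y to OLS coefficients; return the row for one regressor."""
    return np.linalg.solve(X.T @ X, X.T)[focus]

rng = np.random.default_rng(6)
T = 200
shock = rng.standard_normal(T)                       # identified policy shock
controls = rng.standard_normal((T, 3))
X = np.column_stack([np.ones(T), shock, controls])
y = 0.8 * shock + rng.standard_normal(T)             # outcome already shifted to horizon h

w = lp_weights(X, focus=1)                           # per-period weights ('purified' shocks)
contrib = w * y                                      # contribution of each historical episode
assert np.isclose(contrib.sum(), np.linalg.lstsq(X, y, rcond=None)[0][1])
print(np.argsort(-np.abs(contrib))[:5])              # dates dominating the LP estimate
```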
By: | Elliot L. Epstein; Apaar Sadhwani; Kay Giesecke |
Abstract: | In many financial prediction problems, the behavior of individual units (such as loans, bonds, or stocks) is influenced by observable unit-level factors and macroeconomic variables, as well as by latent cross-sectional effects. Traditional approaches attempt to capture these latent effects via handcrafted summary features. We propose a Set-Sequence model that eliminates the need for handcrafted features. The Set model first learns a shared cross-sectional summary at each period. The Sequence model then ingests the summary-augmented time series for each unit independently to predict its outcome. Both components are learned jointly over arbitrary sets sampled during training. Our approach harnesses the set nature of the cross-section and is computationally efficient, generating set summaries in linear time relative to the number of units. It is also flexible, allowing the use of existing sequence models and accommodating a variable number of units at inference. Empirical evaluations demonstrate that our Set-Sequence model significantly outperforms benchmarks on stock return prediction and mortgage behavior tasks. Code will be released. |
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.11243 |
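A minimal reading of the architecture, sketched in PyTorch under assumptions: a permutation-invariant set summary (here, a mean-pooled MLP) per period, concatenated to each unit's series and fed to a GRU. The paper's actual components may differ; every name here is hypothetical.

```python
import torch
import torch.nn as nn

class SetSequence(nn.Module):
    """Sketch: per-period set summary appended to each unit's sequence, then a GRU."""
    def __init__(self, n_feat, d_summary=8, d_hidden=16):
        super().__init__()
        self.set_enc = nn.Sequential(nn.Linear(n_feat, d_summary), nn.ReLU())
        self.gru = nn.GRU(n_feat + d_summary, d_hidden, batch_first=True)
        self.head = nn.Linear(d_hidden, 1)

    def forward(self, x):                      # x: (units, time, features)
        summary = self.set_enc(x).mean(dim=0)  # (time, d_summary): linear in the number of units
        z = torch.cat([x, summary.expand(x.shape[0], -1, -1)], dim=-1)
        h, _ = self.gru(z)
        return self.head(h[:, -1])             # one prediction per unit

units, time_steps, feats = 32, 24, 5
model = SetSequence(n_feat=feats)
preds = model(torch.randn(units, time_steps, feats))  # any number of units works at inference
```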
By: | Paker, Meredith; Stephenson, Judy; Wallis, Patrick |
Abstract: | Understanding long-run economic growth requires reliable historical data, yet the vast majority of long-run economic time series are drawn from incomplete records with significant temporal and geographic gaps. Conventional solutions to these gaps rely on linear regressions that risk bias or overfitting when data are scarce. We introduce “past predictive modeling, ” a framework that leverages machine learning and out-of-sample predictive modeling techniques to reconstruct representative historical time series from scarce data. Validating our approach using nominal wage data from England, 1300-1900, we show that this new method leads to more accurate and generalizable estimates, with bootstrapped standard errors 72% lower than benchmark linear regressions. Beyond improving accuracy, these improved wage estimates for England yield new insights into the impact of the Black Death on inequality, the economic geography of pre-industrial growth, and productivity over the long run.
Keywords: | machine learning; predictive modeling; wages; black death; industrial revolution |
JEL: | J31 C53 N33 N13 N63 |
Date: | 2025–06–13 |
URL: | https://d.repec.org/n?u=RePEc:ehl:lserod:128852 |
By: | Ramon van den Akker; Bas J. M. Werker; Bo Zhou |
Abstract: | We establish an asymptotic framework for the statistical analysis of the stochastic contextual multi-armed bandit problem (CMAB), which is widely employed in adaptively randomized experiments across various fields. While algorithms for maximizing rewards or, equivalently, minimizing regret have received considerable attention, our focus centers on statistical inference with adaptively collected data under the CMAB model. To this end we derive the limit experiment (in the Hajek-Le Cam sense). This limit experiment is highly nonstandard and, applying Girsanov's theorem, we obtain a structural representation in terms of stochastic differential equations. This structural representation, and a general weak convergence result we develop, allow us to obtain the asymptotic distribution of statistics for the CMAB problem. In particular, we obtain the asymptotic distributions for the classical t-test (non-Gaussian), Adaptively Weighted tests, and Inverse Propensity Weighted tests (non-Gaussian). We show that, when comparing both arms, validity of these tests requires the sampling scheme to be translation invariant in a way we make precise. We propose translation-invariant versions of Thompson, tempered greedy, and tempered Upper Confidence Bound sampling. Simulation results corroborate our asymptotic analysis. |
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.13897 |
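For concreteness, here is the inverse propensity weighted statistic whose asymptotic behavior the paper characterizes, in a deliberately simple non-adaptive setting; the limit-experiment theory and the translation-invariant sampling schemes are not reproduced.

```python
import numpy as np

def ipw_arm_means(rewards, arms, propensities, n_arms=2):
    """Inverse-propensity-weighted arm means from (possibly adaptively) collected bandit data."""
    est = np.empty(n_arms)
    for a in range(n_arms):
        w = (arms == a) / propensities[np.arange(len(arms)), a]
        est[a] = np.mean(w * rewards)
    return est

rng = np.random.default_rng(7)
T, mu = 5000, np.array([0.0, 0.2])
props = np.full((T, 2), 0.5)              # a fixed scheme for illustration; adaptive schemes
arms = rng.integers(0, 2, size=T)         # would update props as data accrue
rewards = mu[arms] + rng.standard_normal(T)
print(ipw_arm_means(rewards, arms, props))  # approximately [0.0, 0.2]
```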
By: | Yingjie Kuang; Tianchen Zhang; Zhen-Wei Huang; Zhongjie Zeng; Zhe-Yuan Li; Ling Huang; Yuefang Gao |
Abstract: | Accurately predicting customers' purchase intentions is critical to the success of a business strategy. Current research mainly focuses on the specific types of products that customers are likely to purchase in the future; little attention has been paid to the critical question of whether customers will repurchase at all. Predicting whether a customer will make the next purchase is a classic time series forecasting task. However, in real-world purchasing behavior, customer groups are typically imbalanced: there are many occasional buyers and few loyal customers. This head-to-tail distribution limits traditional time series forecasting methods when dealing with such problems. To address these challenges, this paper proposes a unified Clustering and Attention mechanism GRU model (CAGRU) that leverages multi-modal data for customer purchase intention prediction. The framework first performs customer profiling with respect to customer characteristics and clusters customers into groups with similar features. The time series features of each customer cluster are then extracted by a GRU neural network, with an attention mechanism introduced to capture the significance of sequence locations. Furthermore, to mitigate the head-to-tail imbalance across customer segments, we train the model separately for each segment, capturing more accurately both the behavioral differences between segments and the similarities within them. We construct four datasets and conduct extensive experiments to demonstrate the superiority of the proposed CAGRU approach.
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.13558 |
By: | Li, Mengbing; Shi, Chengchun; Wu, Zhenke; Fryzlewicz, Piotr |
Abstract: | We consider reinforcement learning (RL) in possibly nonstationary environments. Many existing RL algorithms in the literature rely on the stationarity assumption that requires the state transition and reward functions to be constant over time. However, this assumption is restrictive in practice and is likely to be violated in a number of applications, including traffic signal control, robotics and mobile health. In this paper, we develop a model-free test to assess the stationarity of the optimal Q-function based on pre-collected historical data, without additional online data collection. Based on the proposed test, we further develop a change point detection method that can be naturally coupled with existing state-of-the-art RL methods designed in stationary environments for online policy optimization in nonstationary environments. The usefulness of our method is illustrated by theoretical results, simulation studies, and a real data example from the 2018 Intern Health Study. A Python implementation of the proposed procedure is publicly available at https://github.com/limengbinggz/CUSUM-RL
Keywords: | change point detection; hypothesis testing; nonstationarity; reinforcement learning |
JEL: | C1 |
Date: | 2025–06–30 |
URL: | https://d.repec.org/n?u=RePEc:ehl:lserod:127507 |
By: | Lin Deng; Michael Stanley Smith; Worapree Maneesoonthorn |
Abstract: | Multivariate distributions that allow for asymmetry and heavy tails are important building blocks in many econometric and statistical models. The Unified Skew-t (UST) is a promising choice because it is both scalable and allows for a high level of flexibility in the asymmetry in the distribution. However, it suffers from parameter identification and computational hurdles that have to date inhibited its use for modeling data. In this paper we propose a new tractable variant of the unified skew-t (TrUST) distribution that addresses both challenges. Moreover, the copula of this distribution is shown to also be tractable, while allowing for greater heterogeneity in asymmetric dependence over variable pairs than the popular skew-t copula. We show how Bayesian posterior inference for both the distribution and its copula can be computed using an extended likelihood derived from a generative representation of the distribution. The efficacy of this Bayesian method, and the enhanced flexibility of both the TrUST distribution and its implicit copula, is first demonstrated using simulated data. Applications of the TrUST distribution to highly skewed regional Australian electricity prices, and the TrUST copula to intraday U.S. equity returns, demonstrate how our proposed distribution and its copula can provide substantial increases in accuracy over the popular skew-t and its copula in practice. |
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.10849 |
By: | Ayush Jha; Abootaleb Shirvani; Ali Jaffri; Svetlozar T. Rachev; Frank J. Fabozzi |
Abstract: | This paper develops and estimates a multivariate affine GARCH(1, 1) model with Normal Inverse Gaussian innovations that captures time-varying volatility, heavy tails, and dynamic correlation across asset returns. We generalize the Heston-Nandi framework to a multivariate setting and apply it to 30 Dow Jones Industrial Average stocks. The model jointly supports three core financial applications: dynamic portfolio optimization, wealth path simulation, and option pricing. Closed-form solutions are derived for a Constant Relative Risk Aversion (CRRA) investor's intertemporal asset allocation, and we implement a forward-looking risk-adjusted performance comparison against Merton-style constant strategies. Using the model's conditional volatilities, we also construct implied volatility surfaces for European options, capturing skew and smile features. Empirically, we document substantial wealth-equivalent utility losses from ignoring time-varying correlation and tail risk. These findings underscore the value of a unified econometric framework for analyzing joint asset dynamics and for managing portfolio and derivative exposures under non-Gaussian risks. |
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.12198 |
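The NIG innovation admits a standard normal mean-variance mixture representation, which makes simulation straightforward. A sketch under the usual (alpha, beta, mu, delta) parameterization, which may differ from the paper's conventions:

```python
import numpy as np

def sample_nig(alpha, beta, mu, delta, size, rng):
    """NIG as a normal mean-variance mixture: X = mu + beta*Z + sqrt(Z)*N, Z inverse-Gaussian."""
    gamma_ = np.sqrt(alpha**2 - beta**2)
    z = rng.wald(delta / gamma_, delta**2, size=size)   # IG mixing variable (numpy's Wald)
    return mu + beta * z + np.sqrt(z) * rng.standard_normal(size)

rng = np.random.default_rng(8)
x = sample_nig(alpha=2.0, beta=-0.5, mu=0.0, delta=1.0, size=100_000, rng=rng)
skew = ((x - x.mean()) ** 3).mean() / x.std() ** 3
print(x.mean(), x.std(), skew)                          # beta < 0 yields negative skew, heavy tails
```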
By: | Stijn De Backer; Luis E. C. Rocha; Jan Ryckebusch; Koen Schoors |
Abstract: | The analysis of logarithmic return distributions defined over large time scales is crucial for understanding the long-term dynamics of asset price movements. For large time scales of the order of two trading years, the anticipated Gaussian behavior of the returns often does not emerge, and their distributions often exhibit a high level of asymmetry and bimodality. These features are inadequately captured by the majority of classical models for financial time series and return distributions. In the presented analysis, we use a model based on the discrete-time quantum walk to characterize the observed asymmetry and bimodality. The quantum walk distinguishes itself from a classical diffusion process by the occurrence of interference effects, which allows for the generation of bimodal and asymmetric probability distributions. By capturing the broader trends and patterns that emerge over extended periods, this analysis complements traditional short-term models and offers opportunities to more accurately describe the probabilistic structure underlying long-term financial decisions.
Date: | 2025–05 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2505.13019 |
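A toy discrete-time quantum walk (the Hadamard walk on a line) already produces the bimodality the abstract exploits: interference concentrates mass near +-steps/sqrt(2) rather than diffusing as a Gaussian. This sketch is generic, not the authors' calibrated model.

```python
import numpy as np

def hadamard_walk(steps, coin0=(1 / np.sqrt(2), 1j / np.sqrt(2))):
    """Discrete-time quantum walk on the line; returns the position distribution after `steps`."""
    n = 2 * steps + 1                          # positions -steps .. +steps
    psi = np.zeros((n, 2), dtype=complex)      # amplitude per (position, coin state)
    psi[steps] = coin0                         # start at the origin
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    for _ in range(steps):
        psi = psi @ H.T                        # Hadamard coin toss at every site
        shifted = np.zeros_like(psi)
        shifted[1:, 0] = psi[:-1, 0]           # coin |0> moves right
        shifted[:-1, 1] = psi[1:, 1]           # coin |1> moves left
        psi = shifted
    return (np.abs(psi) ** 2).sum(axis=1)      # interference yields a bimodal distribution

p = hadamard_walk(200)                         # total probability mass sums to 1
# an asymmetric initial coin, e.g. coin0=(1, 0), skews the mass toward one side
```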