nep-ets New Economics Papers
on Econometric Time Series
Issue of 2026–02–09
ten papers chosen by
Simon Sosvilla-Rivero, Instituto Complutense de Análisis Económico


  1. BASTION: A Bayesian Framework for Trend and Seasonality Decomposition By Jason B. Cho; David S. Matteson
  2. Nonlinear Dynamic Factor Analysis With a Transformer Network By Oliver Snellman
  3. A General Randomized Test for Alpha By Daniele Massacci; Lucio Sarno; Lorenzo Trapani; Pierluigi Vallarino
  4. Finite-Sample Properties of Model Specification Tests for Multivariate Dynamic Regression Models By Koichiro Moriya; Akihiko Noda
  5. Brownian ReLU (Br-ReLU): A New Activation Function for a Long Short-Term Memory (LSTM) Network By George Awiakye-Marfo; Elijah Agbosu; Victoria Mawuena Barns; Samuel Asante Gyamerah
  6. Predictive modeling the past By Paker, Meredith; Stephenson, Judy; Wallis, Patrick
  7. Specification Choice in Local Projections: Evidence from Monetary Policy Shocks By Eric Fortier
  8. Fast and user-friendly econometrics estimations: The R package fixest By Laurent R. Bergé; Kyle Butts; Grant McDermott
  9. A machine learning approach to volatility forecasting By Kim Christensen; Mathias Siggaard; Bezirgen Veliyev
  10. Trade uncertainty impact on stock-bond correlations: Insights from conditional correlation models By Demetrio Lacava; Edoardo Otranto

  1. By: Jason B. Cho; David S. Matteson
    Abstract: We introduce BASTION (Bayesian Adaptive Seasonality and Trend DecompositION), a flexible Bayesian framework for decomposing time series into trend and multiple seasonality components. We cast the decomposition as a penalized nonparametric regression and establish formal conditions under which the trend and seasonal components are uniquely identifiable, an issue only treated informally in the existing literature. BASTION offers three key advantages over existing decomposition methods: (1) accurate estimation of trend and seasonality amidst abrupt changes, (2) enhanced robustness against outliers and time-varying volatility, and (3) robust uncertainty quantification. We evaluate BASTION against established methods, including TBATS, STR, and MSTL, using both simulated and real-world datasets. By effectively capturing complex dynamics while accounting for irregular components such as outliers and heteroskedasticity, BASTION delivers a more nuanced and interpretable decomposition. To support further research and practical applications, BASTION is available as an R package at https://github.com/Jasoncho0914/BASTION
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.18052
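    As a point of reference for the decomposition task above, the following minimal R sketch simulates a series with an abrupt trend shift and weekly seasonality and decomposes it with MSTL, one of the comparators named in the abstract. The BASTION package linked above would be applied to the same kind of series; its exact interface is not described in the abstract.
      library(forecast)

      set.seed(1)
      n      <- 730
      trend  <- c(rep(0, n / 2), rep(5, n / 2))      # abrupt level shift in the trend
      season <- 3 * sin(2 * pi * seq_len(n) / 7)     # weekly seasonal pattern
      y      <- msts(trend + season + rnorm(n), seasonal.periods = c(7, 365.25))

      fit <- mstl(y)    # decomposes into trend, seasonal components, and remainder
      autoplot(fit)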
  2. By: Oliver Snellman
    Abstract: The paper develops a Transformer architecture for estimating dynamic factors from multivariate time series data under flexible identification assumptions. Performance on small datasets is improved substantially by using a conventional factor model as prior information via a regularization term in the training objective. The results are interpreted with Attention matrices that quantify the relative importance of variables and their lags for the factor estimate. Time variation in Attention patterns can help detect regime switches and evaluate narratives. Monte Carlo experiments suggest that the Transformer is more accurate than the linear factor model when the data deviate from linear-Gaussian assumptions. An empirical application uses the Transformer to construct a coincident index of U.S. real economic activity.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.12039
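    The abstract does not state the training objective explicitly; one plausible form, assuming a mean-squared reconstruction loss and a linear factor estimate \tilde{f}_t serving as the prior, is
      L(\theta) = \sum_t \| x_t - \hat{x}_t(\theta) \|^2 + \lambda \sum_t \| f_t(\theta) - \tilde{f}_t \|^2,
    where f_t(\theta) is the Transformer's factor estimate and \lambda sets how strongly it is shrunk toward the conventional factor model.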
  3. By: Daniele Massacci (King’s College London); Lucio Sarno (University of Cambridge, Centre for Economic Policy Research); Lorenzo Trapani (Università di Pavia, University of Leicester); Pierluigi Vallarino (Erasmus University Rotterdam and Tinbergen Institute)
    Abstract: We propose a methodology to construct tests for the null hypothesis that the pricing errors of a panel of asset returns are jointly equal to zero in a linear factor asset pricing model - that is, the null of "zero alpha". We consider, as a leading example, a model with observable, tradable factors, but we also develop extensions to accommodate non-tradable and latent factors. The test is based on equation-by-equation estimation, using a randomized version of the estimated alphas, which only requires rates of convergence. The distinct features of the proposed methodology are that it does not require the estimation of any covariance matrix, and that it allows for both N and T to pass to infinity, with the former possibly faster than the latter. Further, unlike extant approaches, the procedure can accommodate conditional heteroskedasticity, non-Gaussianity, and even strong cross-sectional dependence in the error terms. We also propose a de-randomized decision rule to decide in favor of or against the correct specification of a linear factor pricing model. Monte Carlo simulations show that the test has satisfactory properties and compares favorably to several existing tests. The usefulness of the testing procedure is illustrated through an application of linear factor pricing models to price the constituents of the S&P 500.
    Date: 2025–07–25
    URL: https://d.repec.org/n?u=RePEc:tin:wpaper:20250045
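    In standard notation, the linear factor pricing model and the null described above are
      r_{it} = \alpha_i + \beta_i' f_t + \varepsilon_{it},   i = 1, ..., N,   t = 1, ..., T,
      H_0: \alpha_1 = \alpha_2 = \cdots = \alpha_N = 0,
    with the randomization applied to the equation-by-equation estimates \hat{\alpha}_i rather than to a pooled statistic built from an estimated covariance matrix.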
  4. By: Koichiro Moriya; Akihiko Noda
    Abstract: This paper proposes a new multivariate model specification test that generalizes Durbin regression to a seemingly unrelated regression framework and reframes the Durbin approach as a GLS-class estimator. The proposed estimator explicitly models cross-equation dependence and the joint second-order dynamics of regressors and disturbances. It remains consistent under a comparatively weak dependence condition in which conventional OLS- and GLS-based estimators can be inconsistent, and it is asymptotically efficient under stronger conditions. Monte Carlo experiments indicate that the associated Wald test achieves improved size control and competitive power in finite samples, especially when combined with a bootstrap-based bias correction. An empirical application further illustrates that the proposed procedure delivers stable inference and is practically useful for multi-equation specification testing.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.21272
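    For readers unfamiliar with the baseline framework, the following self-contained R sketch estimates a two-equation seemingly unrelated regression with systemfit; variable names are placeholders, and the paper's Durbin-style GLS estimator itself is not implemented here.
      library(systemfit)

      set.seed(1)
      n  <- 200
      df <- data.frame(x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n))
      e  <- MASS::mvrnorm(n, mu = c(0, 0),
                          Sigma = matrix(c(1, 0.6, 0.6, 1), 2))  # correlated disturbances
      df$y1 <- 1 + df$x1 + df$x2 + e[, 1]
      df$y2 <- 2 - df$x1 + df$x3 + e[, 2]

      eqs <- list(eq1 = y1 ~ x1 + x2,
                  eq2 = y2 ~ x1 + x3)
      fit <- systemfit(eqs, method = "SUR", data = df)  # exploits cross-equation correlation
      summary(fit)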
  5. By: George Awiakye-Marfo; Elijah Agbosu; Victoria Mawuena Barns; Samuel Asante Gyamerah
    Abstract: Deep learning models are effective for sequential data modeling, yet commonly used activation functions such as ReLU, LeakyReLU, and PReLU often exhibit gradient instability when applied to noisy, non-stationary financial time series. This study introduces BrownianReLU, a stochastic activation function induced by Brownian motion that enhances gradient propagation and learning stability in Long Short-Term Memory (LSTM) networks. Constructed via Monte Carlo simulation, BrownianReLU provides a smooth, adaptive response for negative inputs, mitigating the dying ReLU problem. The proposed activation is evaluated on financial time series from Apple, GCB, and the S&P 500, as well as LendingClub loan data for classification. Results show consistently lower Mean Squared Error and higher $R^2$ values, indicating improved predictive accuracy and generalization. Although the ROC-AUC metric is of limited use in the classification tasks, the choice of activation significantly affects the trade-off between accuracy and sensitivity, with BrownianReLU and the other selected activation functions yielding practically meaningful performance.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.16446
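    The abstract does not give the activation's exact functional form, so the R sketch below is an assumption: a ReLU whose negative branch is perturbed by a Brownian increment, keeping a stochastic, nonzero gradient path for negative inputs.
      brownian_relu <- function(x, sigma = 0.05) {
        # Hypothetical form: identity for positive inputs; for negative inputs,
        # a random slope driven by a Brownian increment over a unit time step,
        # so the unit never has an exactly zero gradient (the "dying ReLU" issue).
        w <- sigma * rnorm(length(x))   # increments W(t+1) - W(t) ~ N(0, sigma^2)
        ifelse(x > 0, x, w * x)
      }

      brownian_relu(c(-2, -0.5, 0.3, 1.2))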
  6. By: Paker, Meredith; Stephenson, Judy; Wallis, Patrick
    Abstract: Understanding long-run economic growth requires reliable historical data, yet the vast majority of long-run economic time series are drawn from incomplete records with significant temporal and geographic gaps. Conventional solutions to these gaps rely on linear regressions that risk bias or overfitting when data are scarce. We introduce “past predictive modeling,” a framework that leverages machine learning and out-of-sample predictive modeling techniques to reconstruct representative historical time series from scarce data. Validating our approach using nominal wage data from England, 1300–1900, we show that this new method leads to more accurate and generalizable estimates, with bootstrapped standard errors 72% lower than benchmark linear regressions. Beyond improving accuracy, these improved wage estimates for England yield new insights into the impact of the Black Death on inequality, the economic geography of pre-industrial growth, and productivity over the long run.
    Keywords: machine learning; predictive modeling; wages; black death; industrial revolution
    JEL: J31 C53 N33 N13 N63
    Date: 2025–06–13
    URL: https://d.repec.org/n?u=RePEc:ehl:wpaper:128852
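    As a generic illustration of the out-of-sample logic described above (not the paper's pipeline), the R sketch below compares a linear benchmark with a random forest on a held-out test set; the series and predictor names are placeholders.
      library(randomForest)

      set.seed(1)
      n  <- 300                                   # deliberately scarce sample
      df <- data.frame(year = 1:n, region = factor(sample(1:4, n, replace = TRUE)))
      df$wage <- 10 + 0.02 * df$year + as.numeric(df$region) + rnorm(n)

      train <- df[1:200, ]; test <- df[201:n, ]
      lm_fit <- lm(wage ~ year + region, data = train)
      rf_fit <- randomForest(wage ~ year + region, data = train)

      mean((predict(lm_fit, test) - test$wage)^2)  # out-of-sample MSE, linear benchmark
      mean((predict(rf_fit, test) - test$wage)^2)  # out-of-sample MSE, ML alternative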
  7. By: Eric Fortier (PhD Candidate, Simon Fraser University)
    Abstract: This study examines how specification choices in local projections influence the estimation of impulse responses to monetary policy shocks. Using monthly U.S. data from 1983 to 2007 and the Aruoba and Drechsel (2024) shock series, I systematically compare levels and long-differences specifications across 12 control sets and multiple lag lengths. The results are evaluated both qualitatively, by benchmarking impulse responses against theory, standard beliefs, and prior evidence, and quantitatively, using information criteria (AIC, BIC, CV). The findings show that the long-differences specification produces distorted long-term dynamics. In contrast, the levels specification, when paired with a robust control set, generates well-behaved responses. These results stress the importance of careful specification and provide practical guidance for researchers applying local projections to monetary policy shocks.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:sfu:sfudps:dp26-01
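    A minimal R sketch of the levels specification discussed above, with hypothetical inputs (y in levels, a shock series, and a matrix of controls); the coefficient on the shock at each horizon traces the impulse response.
      lp_levels <- function(y, shock, controls, H = 48) {
        irf <- numeric(H + 1)
        n   <- length(y)
        for (h in 0:H) {
          yh <- y[(1 + h):n]                        # dependent variable y_{t+h}
          s  <- shock[1:(n - h)]                    # shock at time t
          X  <- controls[1:(n - h), , drop = FALSE] # control set at time t
          irf[h + 1] <- coef(lm(yh ~ s + X))["s"]   # horizon-h response
        }
        irf
      }
    In practice each horizon regression would be paired with HAC standard errors (e.g., sandwich::NeweyWest) for inference.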
  8. By: Laurent R. Bergé; Kyle Butts; Grant McDermott
    Abstract: fixest is an R package for fast and flexible econometric estimation, providing a comprehensive toolkit for applied researchers. The package particularly excels at fixed-effects estimation, supported by a novel fixed-point acceleration algorithm implemented in C++. This algorithm achieves rapid convergence across a broad class of data contexts and further enables estimation of complex models, including those with varying slopes, in a highly efficient manner. Beyond computational speed, fixest provides a unified syntax for a wide variety of models: ordinary least squares, instrumental variables, generalized linear models, maximum likelihood, and difference-in-differences estimators. An expressive formula interface enables multiple estimations, stepwise regressions, and variable interpolation in a single call, while users can make on-the-fly inference adjustments using a variety of built-in robust standard errors. Finally, fixest provides methods for publication-ready regression tables and coefficient plots. Benchmarks against leading alternatives in R, Python, and Julia demonstrate best-in-class performance, and the paper includes many worked examples illustrating the core functionality.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.21749
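    A short example of the unified syntax described above, on simulated placeholder data (variables y, x1, x2 and fixed effects firm and year):
      library(fixest)

      set.seed(1)
      df <- data.frame(firm = rep(1:50, each = 10), year = rep(2001:2010, 50),
                       x1 = rnorm(500), x2 = rnorm(500))
      df$y <- df$x1 + 0.5 * df$x2 + df$firm / 50 + rnorm(500)

      # Two-way fixed effects, with standard errors clustered by firm
      est <- feols(y ~ x1 + x2 | firm + year, data = df, vcov = ~firm)

      # Stepwise syntax runs several specifications in a single call
      ests <- feols(y ~ sw(x1, x1 + x2) | firm + year, data = df)

      etable(ests)    # publication-ready regression table
      coefplot(est)   # coefficient plot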
  9. By: Kim Christensen; Mathias Siggaard; Bezirgen Veliyev
    Abstract: We examine how accurately machine learning (ML) forecasts the realized variance of the Dow Jones Industrial Average index constituents. We compare several ML algorithms, including regularization, regression trees, and neural networks, to multiple Heterogeneous AutoRegressive (HAR) models. ML is implemented with minimal hyperparameter tuning. In spite of this, ML is competitive and beats the HAR lineage, even when the only predictors are the daily, weekly, and monthly lags of realized variance. The forecast gains are more pronounced at longer horizons. We attribute this to higher persistence in the ML models, which helps to approximate the long memory of realized variance. ML also excels at locating incremental information about future volatility in additional predictors. Lastly, we propose an ML measure of variable importance based on accumulated local effects. This shows that while there is agreement about the most important predictors, there is disagreement on their ranking, helping to reconcile our results.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.13014
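    For reference, the HAR benchmark named above uses daily, weekly, and monthly averages of lagged realized variance as the only predictors; a minimal R sketch with a placeholder series rv:
      har_fit <- function(rv) {
        n  <- length(rv)
        t  <- 22:(n - 1)                                  # 22 lags needed for the monthly term
        rd <- rv[t]                                       # daily lag
        rw <- sapply(t, function(i) mean(rv[(i - 4):i]))  # weekly average (5 days)
        rm <- sapply(t, function(i) mean(rv[(i - 21):i])) # monthly average (22 days)
        lm(rv[t + 1] ~ rd + rw + rm)                      # RV_{t+1} on HAR components
      }

      har_fit(rexp(500))   # illustrative run on synthetic positive data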
  10. By: Demetrio Lacava; Edoardo Otranto
    Abstract: This paper investigates the impact of Trade Policy Uncertainty (TPU) on stock-bond correlation dynamics in the United States. Using daily data on major U.S. stock indices and the 10-year Treasury bond from 2015 to 2025, we estimate correlations within a two-step GARCH-based framework, relying on multivariate specifications including the Constant Conditional Correlation (CCC), Smooth Transition Conditional Correlation (STCC), and Dynamic Conditional Correlation (DCC) models. We extend these frameworks by incorporating the TPU index and a presidential dummy to capture the effects of trade uncertainty and government cycles. The findings show that constant correlation models are strongly rejected in favor of time-varying specifications. Both STCC and DCC models confirm TPU's central role in driving correlation dynamics, with significant differences across political regimes. DCC models augmented with TPU and political effects deliver the best in-sample fit and the strongest forecasting performance, as measured by statistical and economic loss functions.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.21447
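    For orientation, the DCC(1, 1) recursion of Engle (2002), which the TPU-augmented specifications above extend, evolves a quasi-correlation matrix Q_t of the standardized residuals u_t as
      Q_t = (1 - a - b) \bar{Q} + a u_{t-1} u_{t-1}' + b Q_{t-1},
      R_t = diag(Q_t)^{-1/2} Q_t diag(Q_t)^{-1/2},
    where \bar{Q} is the unconditional correlation matrix of u_t; the augmented models additionally let these dynamics depend on the TPU index and the presidential dummy.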

This nep-ets issue is ©2026 by Simon Sosvilla-Rivero. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.