nep-ets New Economics Papers
on Econometric Time Series
Issue of 2026–01–26
nineteen papers chosen by
Simon Sosvilla-Rivero, Instituto Complutense de Análisis Económico


  1. A Shrinkage Factor-Augmented VAR for High-Dimensional Macro–Fiscal Dynamics By Kyriakopoulou, Dimitra
  2. Order-Constrained Spectral Causality in Multivariate Time Series By Alejandro Rodriguez Dominguez
  3. Dynamic Mortality Forecasting via Mixed-Frequency State-Space Models By Runze Li; Rui Zhou; David Pitt
  4. Stochastic Volatility Modelling with LSTM Networks: A Hybrid Approach for S&P 500 Index Volatility Forecasting By Anna Perekhodko; Robert Ślepaczuk
  5. Principled Identification of Structural Dynamic Models By Neville Francis; Peter Reinhard Hansen; Chen Tong
  6. Distribution-Matching Posterior Inference for Incomplete Structural Models By Takashi Kano
  7. Robust Two-Sample Mean Inference under Serial Dependence By Ulrich Hounyo; Min Seong Kim
  8. Estimation of a Dynamic Tobit Model with a Unit Root By Anna Bykhovskaya; James A. Duffy
  9. Forecasting inflation: The sum of the cycles outperforms the whole By Verona, Fabio
  10. Two-Step Regularized HARX to Measure Volatility Spillovers in Multi-Dimensional Systems By Mindy L. Mallory
  11. On the measurement and forecasting of sales volatility: is the quantile approach better? By Nuno Silva
  12. Testing shock independence in Gaussian structural VARs By Dante Amengual; Gabriele Fiorentini; Enrique Sentana
  13. Corrected Forecast Combinations By Chu-An Liu; Andrey L. Vasnev
  14. Explainable Prediction of Economic Time Series Using IMFs and Neural Networks By Pablo Hidalgo; Julio E. Sandubete; Agustín García-García
  15. Large-dimensional cointegrated threshold factor models: The Global Term Structure of Interest Rates By Paulo M.M. Rodrigues; Daniel Abreu
  16. Covariate Augmented CUSUM Bubble Monitoring Procedures By Astill, Sam; Taylor, AM Robert; Zu, Yang
  17. ProbFM: Probabilistic Time Series Foundation Model with Uncertainty Decomposition By Arundeep Chinta; Lucas Vinh Tran; Jay Katukuri
  18. When the Rules Change: Adaptive Signal Extraction via Kalman Filtering and Markov-Switching Regimes By Sungwoo Kang
  19. A Proposal for a Unified Forecast Accuracy Index (UFAI): Toward Multidimensional and Context-Aware Forecast Evaluation By Chellai, Fatih

  1. By: Kyriakopoulou, Dimitra
    Abstract: We propose a ridge-regularized Factor-Augmented Vector Autoregression (FAVAR) for forecasting macro–fiscal systems in data-rich environments where the cross-sectional dimension is large relative to the available sample. The framework combines principal-component factor extraction with a shrinkage-based VAR for the joint dynamics of observed macro–fiscal variables and latent components. Applying the model to Greece, we show that the extracted factors capture meaningful real and nominal structures, while the ridge-regularized VAR delivers stable impulse responses and coherent short- and medium-term dynamics for variables central to the sovereign debt identity. A recursive out-of-sample evaluation indicates that the ridge-FAVAR systematically improves medium-term forecasting accuracy relative to standard AR benchmarks, particularly for real GDP growth and the interest–growth differential. The results highlight the usefulness of shrinkage-augmented factor models for macro–fiscal forecasting and motivate further econometric work on regularized state-space and structural factor VARs.
    Keywords: FAVAR, Ridge Regression, Forecasting, High-Dimensional Data, Fiscal Policy, Debt Dynamics, Macro–Fiscal Modelling
    JEL: C32 C38 C53 C55 E62 H63
    Date: 2025–12
    URL: https://d.repec.org/n?u=RePEc:pra:mprapa:127158
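    The pipeline described in the abstract (principal-component factor extraction, then a shrinkage VAR on observables plus factors) can be sketched in a few lines. This is a minimal numpy illustration on simulated data; the dimensions and the ridge penalty are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data-rich environment: T observations of N indicators,
# plus a small set of observed macro-fiscal variables.
T, N, n_obs, k = 120, 60, 3, 4          # sample size, panel width, observables, factors
X = rng.standard_normal((T, N))          # large standardized information panel
Y_obs = rng.standard_normal((T, n_obs))  # e.g. GDP growth, primary balance, i-g differential

# Step 1: principal-component factors from the panel (via SVD).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
F = U[:, :k] * S[:k]                     # T x k factor estimates

# Step 2: ridge-regularized VAR(1) on [observables, factors].
Z = np.hstack([Y_obs, F])
Zlag, Znow = Z[:-1], Z[1:]
lam = 10.0                               # shrinkage intensity (assumed, not from the paper)
A = np.linalg.solve(Zlag.T @ Zlag + lam * np.eye(Z.shape[1]), Zlag.T @ Znow)

# One-step-ahead forecast of the joint system from the last observation.
forecast = Z[-1] @ A
```

    The closed-form ridge step, solve(Z'Z + λI, Z'Y), is what stabilizes the VAR coefficients when the system dimension is large relative to the available sample.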
  2. By: Alejandro Rodriguez Dominguez
    Abstract: We introduce an operator-theoretic framework for causal analysis in multivariate time series based on order-constrained spectral non-invariance. Directional influence is defined as sensitivity of second-order dependence operators to admissible, order-preserving temporal deformations of a designated source component, yielding an intrinsically multivariate causal notion summarized through orthogonally invariant spectral functionals. Under linear Gaussian assumptions, the criterion coincides with linear Granger causality, while beyond this regime it captures collective and nonlinear directional dependence not reflected in pairwise predictability. We establish existence, uniform consistency, and valid inference for the resulting non-smooth supremum–infimum statistics using shift-based randomization that exploits order-induced group invariance, yielding finite-sample exactness under exact invariance and asymptotic validity under weak dependence without parametric assumptions. Simulations demonstrate correct size and strong power against distributed and bulk-dominated alternatives, including nonlinear dependence missed by linear Granger tests with appropriate feature embeddings. An empirical application to a high-dimensional panel of daily financial return series spanning major asset classes illustrates system-level causal monitoring in practice. Directional organization is episodic and stress-dependent, causal propagation strengthens while remaining multi-channel, dominant causal hubs reallocate rapidly, and statistically robust transmission channels are sparse and horizon-heterogeneous even when aggregate lead–lag asymmetry is weak. The framework provides a scalable and interpretable complement to correlation-, factor-, and pairwise Granger-style analyses for complex systems.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.01216
  3. By: Runze Li; Rui Zhou; David Pitt
    Abstract: High-frequency death counts are now widely available and contain timely information about intra-year mortality dynamics, but most stochastic mortality models are still estimated on annual data and therefore update only when annual totals are released. We propose a mixed-frequency state-space (MF-SS) extension of the Lee-Carter framework that jointly uses annual mortality rates and monthly death counts. The two series are linked through a shared latent monthly mortality factor, with the annual period factor defined as the intra-year average of the monthly factors. The latent monthly factor follows a seasonal ARIMA process, and parameters are estimated by maximum likelihood using an EM algorithm with Kalman filtering and smoothing. This setup enables real-time intra-year updates of the latent state and forecasts as new monthly observations arrive without re-estimating model parameters. Using U.S. data for ages 20–90 over 1999–2019, we evaluate intra-year annual nowcasts and one- to five-year-ahead forecasts. The MF-SS model produces both a direct annual forecast and an annual forecast implied by aggregating monthly projections. In our application, the aggregated monthly forecast is typically more accurate. Incorporating monthly information substantially improves intra-year annual nowcasts, especially after the first few months of the year. As a benchmark, we also fit separate annual and monthly Lee-Carter models and combine their forecasts using temporal reconciliation. Reconciliation improves these independent forecasts but adds little to MF-SS forecasts, consistent with MF-SS pooling information across frequencies during estimation. The MF-SS aggregated monthly forecasts generally outperform both unreconciled and temporally reconciled Lee-Carter forecasts and produce more cautious predictive intervals than the reconciled Lee-Carter approach.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.05702
  4. By: Anna Perekhodko; Robert Ślepaczuk
    Abstract: Accurate volatility forecasting is essential in banking, investment, and risk management, because expectations about future market movements directly influence current decisions. This study proposes a hybrid modelling framework that integrates a Stochastic Volatility (SV) model with a Long Short-Term Memory (LSTM) neural network. The SV model improves statistical precision and captures latent volatility dynamics, especially in response to unforeseen events, while the LSTM network enhances the model's ability to detect complex nonlinear patterns in financial time series. The forecasting is conducted using daily data from the S&P 500 index, covering the period from January 1, 1998 to December 31, 2024. A rolling-window approach is employed to train the model and generate one-step-ahead volatility forecasts. The performance of the hybrid SV-LSTM model is evaluated through both statistical testing and investment simulations. The results show that the hybrid approach outperforms both the standalone SV and LSTM models and contributes to the development of volatility modelling techniques, providing a foundation for improving risk assessment and strategic investment planning in the context of the S&P 500.
    Date: 2025–12
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2512.12250
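    The rolling-window, one-step-ahead design described above can be sketched as follows. A simple AR(1) fitted to log squared returns stands in for the paper's SV-LSTM hybrid; the window length and simulated returns are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_normal(600) * 0.01          # stand-in for S&P 500 daily returns

# Volatility proxy: log of squared returns (a common SV observation equation).
eps = 1e-8
h = np.log(returns**2 + eps)

window, forecasts = 250, []
for t in range(window, len(h)):
    y = h[t - window:t]
    # AR(1) fit on the rolling window -- a stand-in for the SV-LSTM stage.
    x, ynext = y[:-1], y[1:]
    x_c, y_c = x - x.mean(), ynext - ynext.mean()
    phi = (x_c @ y_c) / (x_c @ x_c)
    c = ynext.mean() - phi * x.mean()
    forecasts.append(c + phi * y[-1])              # one-step-ahead log-variance

vol_forecast = np.exp(np.array(forecasts) / 2)     # back to the volatility scale
```

    The key point is the structure: the model is re-fit on each window and only ever asked for the next day's volatility, matching the evaluation design in the abstract.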
  5. By: Neville Francis; Peter Reinhard Hansen; Chen Tong
    Abstract: We take a new perspective on identification in structural dynamic models: rather than imposing restrictions, we optimize an objective. This provides new theoretical insights into traditional Cholesky identification. A correlation-maximizing objective yields an Order- and Scale-Invariant Identification Scheme (OASIS) that selects the orthogonal rotation that best aligns structural shocks with their reduced-form innovations. We revisit a large number of SVAR studies and find, across 22 published SVARs, that the correlations between structural and reduced-form shocks are generally high.
    JEL: C15 C32 E00
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:nbr:nberwo:34623
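    The correlation-maximizing idea can be sketched with an orthogonal Procrustes step: whiten the reduced-form innovations, then choose the orthogonal rotation maximizing the trace of the shock/innovation cross-correlation via an SVD. This is a hedged illustration of the general idea, not the paper's exact OASIS estimator:

```python
import numpy as np

rng = np.random.default_rng(2)
T, n = 500, 3

# Reduced-form innovations with a known covariance structure.
L = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.3, 1.0]])
u = rng.standard_normal((T, n)) @ L.T

# Whiten via the Cholesky factor of the sample covariance.
Sigma = np.cov(u, rowvar=False)
C = np.linalg.cholesky(Sigma)
w = u @ np.linalg.inv(C).T                   # whitened shocks, ~identity covariance

# Procrustes step: choose orthogonal Q maximizing tr(Q'M), where M is the
# cross-moment of whitened shocks and standardized innovations.
M = (w.T @ (u / u.std(axis=0))) / T
W, _, Vt = np.linalg.svd(M)
Q = W @ Vt                                   # orthogonal rotation

e = w @ Q                                    # rotated "structural" shocks
corrs = [np.corrcoef(e[:, i], u[:, i])[0, 1] for i in range(n)]
```

    The resulting corrs are the diagnostic the paper reports for published SVARs: how strongly each structural shock lines up with its own reduced-form innovation.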
  6. By: Takashi Kano
    Abstract: This paper introduces a Bayesian inference framework for incomplete structural models, termed distribution-matching posterior inference (DMPI). Extending the minimal econometric interpretation (MEI), DMPI constructs a divergence-based quasi-likelihood using the Jensen-Shannon divergence between theoretical and empirical population-moment distributions, based on a Dirichlet-multinomial structure with additive smoothing. The framework accommodates model misspecification and stochastic singularity. Posterior inference is implemented via a sequential Monte Carlo algorithm with Metropolis-Hastings mutation that jointly samples structural parameters and theoretical moment distributions. Monte Carlo experiments using misspecified New Keynesian (NK) models demonstrate that DMPI yields robust inference and improves distribution-matching coherence by probabilistically down-weighting moment distributions inconsistent with the structural model. An empirical application to U.S. data shows that a parsimonious stochastic singular NK model provides a better fit to business-cycle moments than an overparameterized full-rank counterpart.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.01077
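    The quasi-likelihood above is built on the Jensen-Shannon divergence between theoretical and empirical moment distributions with additive smoothing. A minimal sketch of that ingredient (the bin counts and smoothing constant are illustrative, not from the paper):

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions (natural log)."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Theoretical vs. empirical moment distributions over, say, 5 bins,
# with additive (Laplace) smoothing as in a Dirichlet-multinomial setup.
theory = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
counts = np.array([28, 30, 18, 14, 10])            # empirical moment frequencies
alpha = 1.0                                         # smoothing constant (assumed)
empirical = (counts + alpha) / (counts.sum() + alpha * len(counts))

d = js_divergence(theory, empirical)
```

    The JS divergence is bounded by log 2, which is one reason it makes a convenient basis for a quasi-likelihood under misspecification.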
  7. By: Ulrich Hounyo; Min Seong Kim
    Abstract: We propose robust two-sample tests for comparing means in time series. The framework accommodates a wide range of applications, including structural breaks, treatment-control comparisons, and group-averaged panel data. We first consider series HAR two-sample t-tests, where standardization employs orthonormal basis projections, ensuring valid inference under heterogeneity and nonparametric dependence structures. We propose a Welch-type t-approximation with adjusted degrees of freedom to account for long-run variance heterogeneity across the series. We further develop a series-based HAR wild bootstrap test, extending traditional wild bootstrap methods to the time-series setting. Our bootstrap avoids resampling blocks of observations and delivers superior finite-sample performance.
    Date: 2025–12
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2512.11259
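    The series HAR standardization projects each demeaned series onto orthonormal basis functions and averages the squared projections to estimate the long-run variance; a Welch-type degrees-of-freedom adjustment then handles heterogeneity across the two samples. A sketch under these assumptions (K = 8 cosine basis functions is an arbitrary choice, and the simulated AR(1) series are illustrative):

```python
import numpy as np

def series_lrv(x, K=8):
    """Series long-run variance estimate using orthonormal cosine basis functions."""
    T = len(x)
    xc = x - x.mean()
    t = (np.arange(1, T + 1) - 0.5) / T
    lam = []
    for k in range(1, K + 1):
        phi = np.sqrt(2.0) * np.cos(np.pi * k * t)   # orthonormal cosine basis
        lam.append((phi @ xc / np.sqrt(T)) ** 2)
    return np.mean(lam)

def welch_hac_t(x, y, K=8):
    """Welch-type two-sample t-statistic with series HAR standardization."""
    vx, vy = series_lrv(x, K) / len(x), series_lrv(y, K) / len(y)
    tstat = (x.mean() - y.mean()) / np.sqrt(vx + vy)
    df = (vx + vy) ** 2 / (vx**2 / K + vy**2 / K)    # Welch-Satterthwaite, K df each
    return tstat, df

def ar1(T, rho, rng):
    e = rng.standard_normal(T)
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + e[t]
    return x

rng = np.random.default_rng(3)
# Two dependent series with equal means but different persistence and lengths.
t_stat, df = welch_hac_t(ar1(400, 0.5, rng), ar1(300, 0.2, rng))
```

    The adjusted degrees of freedom fall between K and 2K depending on how unbalanced the two long-run variances are.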
  8. By: Anna Bykhovskaya; James A. Duffy
    Abstract: This paper studies robust estimation in the dynamic Tobit model under local-to-unity (LUR) asymptotics. We show that both Gaussian maximum likelihood (ML) and censored least absolute deviations (CLAD) estimators are consistent, extending results from the stationary case where ordinary least squares (OLS) is inconsistent. The asymptotic distributions of MLE and CLAD are derived; for the short-run parameters they are shown to be Gaussian, yielding standard normal t-statistics. In contrast, although OLS remains consistent under LUR, its t-statistics are not standard normal. These results enable reliable model selection via sequential t-tests based on ML and CLAD, paralleling the linear autoregressive case. Applications to financial and epidemiological time series illustrate their practical relevance.
    Date: 2025–12
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2512.12110
  9. By: Verona, Fabio
    Abstract: Inflation dynamics reflect forces operating at different cycles, from short-lived shocks to long-term structural trends. We introduce the sum-of-the-cycles (SOC) method, which exploits this multi-frequency structure of inflation for forecasting. SOC decomposes inflation into cyclical components, applies forecasting models suited to their persistence, and recombines them into an aggregate forecast. Across U.S. inflation measures and horizons, SOC consistently outperforms leading time-series benchmarks, reducing forecast errors by about 25 percent at short horizons and nearly 50 percent at long horizons. During the 2020-21 inflation surge, when many models - including advanced machine-learning methods - struggled, SOC retained strong performance by incorporating shortage indicators. Beyond accuracy, SOC enhances interpretability: financial variables dominate high- and business-cycle frequencies, Phillips Curve models are most informative at medium frequencies, and factor-based methods, forecast combinations, and shortage indices prevail at low frequencies. This combination of accuracy and transparency makes SOC a practical complement to existing tools for inflation forecasting and policy analysis.
    Keywords: inflation forecasting, frequency decomposition, cycles, forecast combination, shortage indicators, Phillips curve, macro-finance
    JEL: C22 C53 E31 E32 E37
    Date: 2026
    URL: https://d.repec.org/n?u=RePEc:zbw:bofrdp:335013
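    The decompose-forecast-recombine logic of SOC can be sketched with trailing moving-average filters as the frequency decomposition and AR(1) forecasts as stand-ins for the component models; both are illustrative simplifications of the paper's method:

```python
import numpy as np

def ma(x, w):
    # Trailing moving average (early values use a growing window).
    return np.array([x[max(0, i - w + 1):i + 1].mean() for i in range(len(x))])

def ar1_forecast(c):
    x, y = c[:-1] - c[:-1].mean(), c[1:] - c[1:].mean()
    phi = (x @ y) / (x @ x) if (x @ x) > 0 else 0.0
    return c[1:].mean() + phi * (c[-1] - c[:-1].mean())

rng = np.random.default_rng(4)
infl = 2.0 + np.cumsum(rng.standard_normal(240)) * 0.05 + rng.standard_normal(240) * 0.3

# Decompose into short-, business-cycle-, and low-frequency components
# that sum back to the original series by construction.
low = ma(infl, 32)
med = ma(infl, 8) - low
short = infl - ma(infl, 8)

# Forecast each component separately, then recombine into the aggregate forecast.
soc_forecast = sum(ar1_forecast(c) for c in (short, med, low))
```

    The identity short + med + low = infl is what guarantees the component forecasts recombine into a forecast of the aggregate.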
  10. By: Mindy L. Mallory
    Abstract: We identify volatility spillovers across commodities, equities, and treasuries using a hybrid HAR-ElasticNet framework on daily realized volatility for six futures markets over 2002–2025. Our two-step procedure estimates own-volatility dynamics via OLS to preserve persistence, then applies ElasticNet regularization to cross-market spillovers. The sparse network structure that emerges shows that equity markets (ES, NQ) act as the primary volatility transmitters, while crude oil (CL) is the largest receiver of cross-market shocks. Agricultural commodities remain isolated from the larger network. A simple univariate HAR model matches our model's point-forecast accuracy, but our approach reveals network structure that univariate models cannot. Joint impulse response functions trace how shocks propagate through the network. Our contribution is to demonstrate that hybrid estimation methods can identify meaningful spillover pathways while preserving forecast performance.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.03146
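    The two-step procedure can be sketched as: OLS on a market's own HAR terms first, then an elastic net on cross-market terms fitted to the step-1 residuals. The toy proximal-gradient elastic-net solver and the simulated volatility series below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def har_lags(rv):
    """Constant, daily, weekly (5-day), monthly (22-day) HAR regressors at time t,
    aligned with the target rv[t+1]."""
    idx = range(21, len(rv) - 1)
    d = rv[21:-1]
    w = np.array([rv[t - 4:t + 1].mean() for t in idx])
    m = np.array([rv[t - 21:t + 1].mean() for t in idx])
    return np.column_stack([np.ones_like(d), d, w, m]), rv[22:]

def elastic_net(X, y, alpha=0.1, l1_ratio=0.5, steps=2000):
    """Minimal proximal-gradient elastic net (a sketch, not production code)."""
    n, p = X.shape
    lr = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(steps):
        g = X.T @ (X @ b - y) / n + alpha * (1 - l1_ratio) * b
        z = b - lr * g
        b = np.sign(z) * np.maximum(np.abs(z) - lr * alpha * l1_ratio, 0.0)
    return b

rng = np.random.default_rng(5)
rv_own = np.abs(rng.standard_normal(300)) + 0.5              # own-market realized volatility
rv_other = 0.3 * rv_own + np.abs(rng.standard_normal(300))   # a second market

# Step 1: OLS on the market's own HAR terms to preserve persistence.
X_own, y = har_lags(rv_own)
beta_own, *_ = np.linalg.lstsq(X_own, y, rcond=None)
resid = y - X_own @ beta_own

# Step 2: regularize cross-market spillover terms on the step-1 residuals.
X_cross, _ = har_lags(rv_other)
beta_cross = elastic_net(X_cross[:, 1:], resid, alpha=0.1)
```

    Sparsity in beta_cross is what delivers the interpretable spillover network while the OLS step keeps the own-market persistence intact.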
  11. By: Nuno Silva
    Abstract: This paper asks how best to estimate and forecast firms’ residualized sales growth volatility, a standard measure of idiosyncratic uncertainty. Using a comprehensive dataset of Portuguese firms from 2006 to 2022, I compare the most common approaches used in the literature with a novel quantile-based method that exploits past cross-sectional information and contemporaneous macroeconomic variables and adjusts for the predictability in sales growth rates. I then estimate forecasting models and conduct a simulation exercise to assess the in-sample and out-of-sample performance of all approaches. The paper contributes to the literature by showing that quantile-based estimates and forecasts outperform traditional methods and that sales growth volatility can be measured with reasonable precision, making it suitable for wider application in empirical work. These findings support the application of quantile-based volatility measures to other low-frequency economic variables, especially those characterized by fat-tailed distributions.
    JEL: C53 D22 G30 L25 G32
    Date: 2025
    URL: https://d.repec.org/n?u=RePEc:ptu:wpaper:w202525
  12. By: Dante Amengual (CEMFI, Centro de Estudios Monetarios y Financieros); Gabriele Fiorentini (Università di Firenze and RCEA); Enrique Sentana (CEMFI, Centro de Estudios Monetarios y Financieros)
    Abstract: We propose specification tests for Gaussian SVAR models identified with short- and long-run restrictions that assess the theoretical justification of the chosen identification scheme by checking the independence of the structural shocks. We consider both moment tests that focus on their coskewness and cokurtosis and contingency table tests with discrete and continuous grids. Our simulations confirm the finite sample reliability of resampling versions of our proposals, and their power against interesting alternatives. We also apply them to two influential studies: Kilian (2009) with short-run restrictions in oil markets and Blanchard and Quah (1989) with long-run ones for the aggregate economy.
    Keywords: Consistent test, coskewness, cokurtosis, independence test, moment tests, oil market, pseudo maximum likelihood estimators, supply and demand shocks.
    JEL: C32 C52 E32 Q41 Q43
    Date: 2025–12
    URL: https://d.repec.org/n?u=RePEc:cmf:wpaper:wp2025_2532
  13. By: Chu-An Liu; Andrey L. Vasnev
    Abstract: This paper proposes corrected forecast combinations when the original combined forecast errors are serially dependent. Motivated by the classic Bates and Granger (1969) example, we show that combined forecast errors can be strongly autocorrelated and that a simple correction (adding a fraction of the previous combined error to the next-period combined forecast) can deliver sizable improvements in forecast accuracy, often exceeding the original gains from combining. We formalize the approach within the conditional risk framework of Gibbs and Vasnev (2024), in which the combined error decomposes into a predictable component (measurable at the forecast origin) and an innovation. We then link this correction to efficient estimation of combination weights under time-series dependence via GLS, allowing joint estimation of weights and an error-covariance structure. Using the U.S. Survey of Professional Forecasters for major macroeconomic indices across various subsamples (including pre- and post-2000, the GFC, and COVID), we find that a parsimonious correction of the mean forecast with a coefficient around 0.5 is a robust starting point and often yields material improvements in forecast accuracy. For optimal-weight forecasts, the correction substantially mitigates the forecast combination puzzle by turning poorly performing out-of-sample optimal-weight combinations into competitive forecasts.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.09999
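    The correction itself is one line: add a fraction of the previous combined error to the next combined forecast. A sketch on simulated data, where the AR(1) error structure is an assumption chosen to mimic the serially dependent Bates-Granger setting:

```python
import numpy as np

rng = np.random.default_rng(6)
T = 200
target = np.cumsum(rng.standard_normal(T)) * 0.1

def ar1_noise(T, rho, scale, rng):
    # Serially dependent forecast errors (assumed, to mimic the motivating example).
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + rng.standard_normal() * scale
    return x

# Two stand-in individual forecasts whose errors are strongly autocorrelated.
f1 = target + ar1_noise(T, 0.8, 0.5, rng)
f2 = target + ar1_noise(T, 0.8, 0.7, rng)
combo = 0.5 * (f1 + f2)
err = target - combo

# Correction: add a fraction of the previous combined error to the next forecast.
theta = 0.5                    # the coefficient the paper reports as a robust starting point
corrected = combo[1:] + theta * err[:-1]

mse_combo = np.mean((target[1:] - combo[1:]) ** 2)
mse_corrected = np.mean((target[1:] - corrected) ** 2)
```

    With persistent combined errors the corrected forecast's MSE is substantially below the uncorrected combination's, exactly the mechanism the abstract describes.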
  14. By: Pablo Hidalgo; Julio E. Sandubete; Agustín García-García
    Abstract: This study investigates the contribution of Intrinsic Mode Functions (IMFs) derived from economic time series to the predictive performance of neural network models, specifically Multilayer Perceptrons (MLP) and Long Short-Term Memory (LSTM) networks. To enhance interpretability, DeepSHAP is applied, which estimates the marginal contribution of each IMF while keeping the rest of the series intact. Results show that the last IMFs, representing long-term trends, are generally the most influential according to DeepSHAP, whereas high-frequency IMFs contribute less and may even introduce noise, as evidenced by improved metrics upon their removal. Differences between MLP and LSTM highlight the effect of model architecture on feature relevance distribution, with LSTM allocating importance more evenly across IMFs.
    Date: 2025–12
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2512.12499
  15. By: Paulo M.M. Rodrigues; Daniel Abreu
    Abstract: We extend the two-level factor model to account for cointegration between group-specific factors in large datasets. We propose two nonlinear specifications: (i) a threshold vector error correction model (VECM) that accounts for asymmetric responses across regimes; and (ii) a band VECM that captures discontinuous state-dependent adjustment which activates only when deviations from equilibrium exceed a certain threshold. We examine the small-sample performance of both models through Monte Carlo simulations. In an empirical application, we estimate a band factor VECM on a panel of government bond yields from multiple countries, estimating one global factor and two group-specific factors associated with long- and short-term maturities. The results provide evidence of a discontinuous adjustment in the global term structure of interest rates.
    JEL: E43 C38 C32
    Date: 2025
    URL: https://d.repec.org/n?u=RePEc:ptu:wpaper:w202528
  16. By: Astill, Sam; Taylor, AM Robert; Zu, Yang
    Abstract: We explore how information from covariates can be incorporated into the CUSUM-based real-time monitoring procedure for explosive asset price bubbles developed in Homm and Breitung (2012). Where dynamic covariates are present in the data generating process, the basic CUSUM procedure, which assumes that prices follow a univariate data generating process, will not in general control its false positive rate under the null of no explosivity, even asymptotically. In contrast, accounting for these relevant covariates in the construction of the CUSUM statistics leads to a procedure whose false positive rate can be controlled using the same asymptotic crossing function as employed by Homm and Breitung (2012). Doing so is also shown to have the potential to significantly increase the chance of detecting an emerging bubble episode in finite samples. We additionally allow for time-varying volatility in the innovations driving the model through the use of a kernel-based variance estimator.
    Date: 2026–01–21
    URL: https://d.repec.org/n?u=RePEc:esy:uefcwp:42634
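    A covariate-augmented CUSUM monitor of this kind can be sketched as: estimate the covariate regression and residual variance on a training sample, then track the standardized cumulative sum of covariate-adjusted price changes against a crossing boundary. The boundary constant and the data generating process below are illustrative, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(7)
T_train, T_mon = 200, 100

# A covariate that genuinely drives price changes in the DGP.
x = rng.standard_normal(T_train + T_mon)
dy = 0.8 * x + rng.standard_normal(T_train + T_mon)
# Inject a mildly explosive drift in the monitoring period (the "bubble").
dy[T_train:] += 0.05 * np.arange(T_mon)

# Training period: regress price changes on the covariate, keep residual variance.
X = np.column_stack([np.ones(T_train), x[:T_train]])
beta, *_ = np.linalg.lstsq(X, dy[:T_train], rcond=None)
sigma = np.std(dy[:T_train] - X @ beta)

# Monitoring: CUSUM of covariate-adjusted changes against a sqrt-type boundary.
resid_mon = dy[T_train:] - (beta[0] + beta[1] * x[T_train:])
cusum = np.cumsum(resid_mon) / (sigma * np.sqrt(T_train))
boundary = 2.0 * np.sqrt(1.0 + np.arange(1, T_mon + 1) / T_train)  # illustrative constant
alarm = int(np.argmax(cusum > boundary)) if (cusum > boundary).any() else None
```

    Removing the covariate's contribution before cumulating is the augmentation step: without it, covariate-driven persistence in dy would inflate the CUSUM and distort the false positive rate.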
  17. By: Arundeep Chinta; Lucas Vinh Tran; Jay Katukuri
    Abstract: Time Series Foundation Models (TSFMs) have emerged as a promising approach for zero-shot financial forecasting, demonstrating strong transferability and data efficiency gains. However, their adoption in financial applications is hindered by fundamental limitations in uncertainty quantification: current approaches either rely on restrictive distributional assumptions, conflate different sources of uncertainty, or lack principled calibration mechanisms. While recent TSFMs employ sophisticated techniques such as mixture models, Student's t-distributions, or conformal prediction, they fail to address the core challenge of providing theoretically grounded uncertainty decomposition. For the first time, we present a novel transformer-based probabilistic framework, ProbFM (probabilistic foundation model), that leverages Deep Evidential Regression (DER) to provide principled uncertainty quantification with explicit epistemic-aleatoric decomposition. Unlike existing approaches that pre-specify distributional forms or require sampling-based inference, ProbFM learns optimal uncertainty representations through higher-order evidence learning while maintaining single-pass computational efficiency. To rigorously evaluate the core DER uncertainty quantification approach independent of architectural complexity, we conduct an extensive controlled comparison study using a consistent LSTM architecture across five probabilistic methods: DER, Gaussian NLL, Student's-t NLL, Quantile Loss, and Conformal Prediction. Evaluation on cryptocurrency return forecasting demonstrates that DER maintains competitive forecasting accuracy while providing explicit epistemic-aleatoric uncertainty decomposition. This work establishes both an extensible framework for principled uncertainty quantification in foundation models and empirical evidence for DER's effectiveness in financial applications.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.10591
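    The epistemic-aleatoric split that Deep Evidential Regression provides follows directly from the Normal-Inverse-Gamma evidential parameters. A minimal sketch of that decomposition (parameter names follow Amini et al. 2020, not necessarily ProbFM's notation):

```python
def nig_uncertainty(nu, alpha, beta):
    """Uncertainty decomposition implied by Normal-Inverse-Gamma evidential
    parameters (Deep Evidential Regression, Amini et al. 2020).
    Requires alpha > 1 for the expectations to exist."""
    aleatoric = beta / (alpha - 1.0)           # E[sigma^2]: irreducible data noise
    epistemic = beta / (nu * (alpha - 1.0))    # Var[mu]: model uncertainty
    return aleatoric, epistemic

# More virtual evidence (larger nu) should shrink epistemic uncertainty
# while leaving aleatoric uncertainty unchanged.
al_low, ep_low = nig_uncertainty(nu=1.0, alpha=2.0, beta=0.5)
al_high, ep_high = nig_uncertainty(nu=10.0, alpha=2.0, beta=0.5)
```

    This single-pass, closed-form decomposition is what distinguishes the DER approach from sampling-based or ensemble uncertainty estimates.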
  18. By: Sungwoo Kang
    Abstract: Static linear models of order flow assume constant parameters, failing precisely when they are needed most: during periods of market stress and structural change. This paper proposes a dynamic, state-dependent framework for order flow signal extraction that adapts to shifting market conditions in the Korean stock market. Using daily transaction data from 2020–2024 covering 2,439 stocks and 2.79 million stock-day observations, we implement three complementary methodologies: (1) an Adaptive Kalman Filter where measurement noise variance is explicitly coupled to market volatility; (2) a three-state Markov-Switching model identifying Bull, Normal, and Crisis regimes; and (3) an Asymmetric Response Function capturing differential investor reactions to positive versus negative shocks. We find that foreign investor predictive power increases 8.9-fold during crisis periods relative to bull markets (β_crisis = 0.00204 vs. β_bull = 0.00023), while individual investors exhibit momentum-chasing behavior with 6.3 times stronger response to positive shocks. The integrated "All-Weather" strategy provides modest drawdown reduction during extreme market events, though challenges remain in the post-COVID high-rate environment.
    Date: 2026–01
    URL: https://d.repec.org/n?u=RePEc:arx:papers:2601.05716
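    The first methodology, coupling measurement-noise variance to market volatility, can be sketched with a local-level Kalman filter whose gain shrinks when volatility rises. The data generating process and the coupling function are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(8)
T = 300
vol = 1.0 + 2.0 * (np.arange(T) > 200)           # volatility jumps in a "stress" regime

# Latent order-flow signal (random walk) observed with volatility-scaled noise.
state = np.cumsum(rng.standard_normal(T) * 0.1)
obs = state + rng.standard_normal(T) * vol * 0.5

# Kalman filter with measurement noise variance coupled to the volatility proxy.
q, m, p = 0.1**2, 0.0, 1.0
filtered = np.empty(T)
for t in range(T):
    p = p + q                                     # predict step
    r_t = (0.5 * vol[t]) ** 2                     # adaptive measurement variance
    k = p / (p + r_t)                             # gain shrinks when volatility is high
    m = m + k * (obs[t] - m)                      # update step
    p = (1 - k) * p
    filtered[t] = m
```

    Because r_t grows with volatility, the filter automatically down-weights noisy stress-period observations instead of chasing them, which is the adaptive behavior the abstract emphasizes.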
  19. By: Chellai, Fatih
    Abstract: Forecast accuracy evaluation is a cornerstone in fields as diverse as finance, public health, energy, and meteorology. However, traditional reliance on single-error metrics—such as MAE, RMSE, or MAPE—offers only a fragmented view of a model’s performance, often obscuring critical dimensions like systematic bias, volatility, directional behavior, or shape fidelity. To overcome these limitations, this study proposes the Unified Forecast Accuracy Index (UFAI), a multidimensional and composite metric that consolidates several facets of forecasting quality into a single, interpretable score. UFAI integrates four normalized sub-indices—bias, variance, directional accuracy, and shape preservation—each capturing a distinct performance characteristic. The framework accommodates multiple weighting schemes: equal weighting for simplicity, expert-informed weighting to reflect domain-specific priorities, and data-driven weighting based on statistical principles such as Principal Component Analysis and entropy measures. This flexibility enables users to adapt the index to diverse forecasting objectives and application contexts. The article details the mathematical formulation of each sub-index, discusses the theoretical soundness and practical implications of different weighting strategies, and demonstrates the utility of UFAI through comparative model evaluations. Emphasis is placed on the index’s normalization, interpretability, robustness to outliers, and extensibility to future use cases such as multi-horizon and probabilistic forecasts. By offering a more integrated and context-aware assessment tool, the UFAI marks a significant advancement in forecast evaluation methodology, supporting more reliable model selection and ultimately enhancing decision-making in data-driven environments.
    Keywords: Forecast evaluation, Unified forecast accuracy index, Composite metrics, Model comparison
    JEL: C1 C2 C4
    Date: 2025–12–23
    URL: https://d.repec.org/n?u=RePEc:pra:mprapa:127449
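    A composite index of this kind can be sketched by normalizing several sub-indices to [0, 1] and averaging them with weights. The sub-index formulas below are our own illustrative choices, not the paper's definitions:

```python
import numpy as np

def ufai(actual, forecast, weights=(0.25, 0.25, 0.25, 0.25)):
    """Illustrative composite score: bias, variance, direction, and shape
    sub-indices, each mapped to [0, 1] (higher is better), then weighted."""
    err = forecast - actual
    scale = np.std(actual) + 1e-12
    bias_idx = 1.0 / (1.0 + abs(err.mean()) / scale)       # penalizes systematic bias
    var_idx = 1.0 / (1.0 + err.std() / scale)              # penalizes error volatility
    # Directional accuracy: share of correctly predicted changes.
    dir_idx = np.mean(np.sign(np.diff(forecast)) == np.sign(np.diff(actual)))
    # Shape preservation: correlation mapped to [0, 1].
    shape_idx = (np.corrcoef(actual, forecast)[0, 1] + 1.0) / 2.0
    w = np.asarray(weights, float)
    return float(w @ np.array([bias_idx, var_idx, dir_idx, shape_idx]) / w.sum())

rng = np.random.default_rng(9)
y = np.sin(np.linspace(0, 6 * np.pi, 120)) + rng.standard_normal(120) * 0.1
good = y + rng.standard_normal(120) * 0.05     # tracks the target closely
poor = rng.standard_normal(120)                # uninformative forecast
```

    Equal weights are used here for simplicity; the expert-informed or PCA/entropy-based weighting schemes the abstract mentions would replace the weights tuple.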

This nep-ets issue is ©2026 by Simon Sosvilla-Rivero. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.