NEP: New Economics Papers on Econometric Time Series
Issue of 2025–09–08
Seventeen papers chosen by Simon Sosvilla-Rivero, Instituto Complutense de Análisis Económico
By: | Monica Billio (Ca’ Foscari University of Venice; Venice centre in Economic and Risk Analytics); Roberto Casarin (European Centre for Living Technology; Venice centre in Economic and Risk Analytics); Fausto Corradin (Ca’ Foscari University of Venice); Antonio Peruzzi (Ca’ Foscari University of Venice) |
Abstract: | The Bayes factor (BF) is one of the tools used in Bayesian analysis for model selection. The predictive BF finds application in detecting outliers, which are relevant sources of estimation and forecast errors. An efficient framework for outlier detection, purposely designed for large multidimensional datasets, is provided. Online detection and analytical tractability guarantee the procedure's efficiency. The proposed sequential Bayesian monitoring extends the univariate setup to a matrix–variate one. Prior perturbation based on power discounting is applied to obtain tractable predictive BFs. This way, computationally intensive procedures used in Bayesian analysis are not required. The conditions leading to inconclusive responses in outlier identification are derived, and some robust approaches are proposed that exploit the predictive BF's variability to improve the standard discounting method. The effectiveness of the procedure is studied using simulated data. An illustration is provided through applications to relevant benchmark datasets from macroeconomics and finance. |
Keywords: | Bayesian Modelling, Bayes Factor, Sequential Model Assessment, Outliers |
JEL: | C11 C22 E31 F10 |
Date: | 2025 |
URL: | https://d.repec.org/n?u=RePEc:ven:wpaper:2025:14 |
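As a rough illustration of the idea behind predictive-BF outlier monitoring (not the authors' matrix-variate, power-discounted procedure), the following sketch compares, observation by observation, the Gaussian predictive density of a baseline model against a variance-inflated alternative; a small Bayes factor signals evidence for the outlier-generating model. The `inflate` and `cutoff` parameters are illustrative assumptions.

```python
import math

def normal_pdf(x, mu, var):
    """Density of N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def predictive_bf_outliers(series, mu=0.0, var=1.0, inflate=10.0, cutoff=1.0):
    """Flag time t as an outlier when the predictive Bayes factor of the
    baseline model against a variance-inflated alternative drops below
    `cutoff`, i.e. the evidence moves toward the outlier model."""
    flagged = []
    for t, y in enumerate(series):
        bf = normal_pdf(y, mu, var) / normal_pdf(y, mu, inflate * var)
        if bf < cutoff:
            flagged.append(t)
    return flagged
```

Because both predictive densities are available in closed form, the monitor runs online with no simulation, which is the tractability the abstract emphasizes.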
By: | Mr. Sakai Ando; Shuvam Das; Sultan Orazbayev |
Abstract: | In forecasting economic time series, statistical models often need to be complemented with a process that imposes various constraints in a smooth manner. Systematically imposing constraints while retaining smoothness is important but challenging. Ando (2024) proposes a systematic approach, but a user-friendly package to implement it has not been developed. This paper addresses this gap by introducing a Python package, macroframe-forecast, that allows users to generate forecasts that are both smooth over time and consistent with user-specified constraints. We demonstrate the package’s functionality with two examples on forecasting US GDP and fiscal variables. |
Keywords: | Forecast Reconciliation; Python Package; Macroframework |
Date: | 2025–08–29 |
URL: | https://d.repec.org/n?u=RePEc:imf:imfwpa:2025/172 |
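The following is a generic numpy sketch of the smooth-plus-constrained idea, not the macroframe-forecast API: find the smoothest path (minimal sum of squared second differences) that honors linear constraints such as pinned historical values and a required total. The anchor/total setup is an illustrative assumption.

```python
import numpy as np

def smooth_constrained_path(n, anchors, total):
    """Smoothest length-n path (minimal sum of squared second differences)
    that passes through the given (index, value) anchors and whose entries
    sum to `total`, obtained by solving the KKT system of the quadratic
    program  min x' D'D x  s.t.  C x = d."""
    # Second-difference operator D
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    # Stack linear constraints C x = d
    rows, d = [], []
    for idx, val in anchors:
        r = np.zeros(n)
        r[idx] = 1.0
        rows.append(r)
        d.append(val)
    rows.append(np.ones(n))          # the path must sum to `total`
    d.append(total)
    C = np.vstack(rows)
    d = np.array(d)
    k = C.shape[0]
    K = np.block([[2.0 * D.T @ D, C.T], [C, np.zeros((k, k))]])
    rhs = np.concatenate([np.zeros(n), d])
    sol = np.linalg.solve(K, rhs)
    return sol[:n]
```

When the constraints are consistent with a perfectly linear path, the roughness penalty is driven to zero and the solver returns that line exactly.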
By: | Antoni Espasa; Guillermo Carlomagno |
Abstract: | This paper explores the challenges of modelling high-frequency, non-financial big data time-series. Focusing on daily, hourly, and even minute-level data, the study investigates the presence of various seasonalities (daily, weekly, monthly, and annual) and how these cycles might interrelate and be influenced by weather patterns and calendar variations. By analyzing these cyclical characteristics and data responses to external factors, the paper explores the potential for regime-switching, dynamic, and non-linear models to capture these complexities. Furthermore, it proposes the use of Autometrics – an automated algorithm for identifying parsimonious models – to jointly account for all the data’s peculiarities. The resulting models, beyond structural analysis and forecasting, are useful for constructing real-time quantitative macroeconomic leading indicators, demand planning and dynamic pricing strategies in various sectors that are sensitive to the factors identified in the analysis (e.g., utilities, retail stores, traffic, or labor market indicators). The paper includes an application to the daily series of jobless claims in Chile. |
Date: | 2024–10 |
URL: | https://d.repec.org/n?u=RePEc:chb:bcchwp:1023 |
By: | Tingting Cheng; Jiachen Cong; Fei Liu; Xuanbin Yang |
Abstract: | In this paper, we propose a novel factor-augmented forecasting regression model with a binary response variable. We develop a maximum likelihood estimation method for the regression parameters and establish the asymptotic properties of the resulting estimators. Monte Carlo simulation results show that the proposed estimation method performs very well in finite samples. Finally, we demonstrate the usefulness of the proposed model through an application to U.S. recession forecasting. The proposed model consistently outperforms conventional Probit regression across both in-sample and out-of-sample exercises, by effectively utilizing high-dimensional information through latent factors. |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2507.16462 |
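A sketch of the first stage of a factor-augmented setup like this: extract latent factors from a high-dimensional panel by principal components (a standard stand-in; the paper develops maximum likelihood estimation for the full model) and verify they span the true factor space. The simulated panel and dimensions are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated high-dimensional panel driven by r latent factors
T, N, r = 200, 50, 2
F = rng.standard_normal((T, r))               # true factors
L = rng.standard_normal((N, r))               # loadings
X = F @ L.T + 0.1 * rng.standard_normal((T, N))

def extract_factors(X, r):
    """Principal-components factor estimates: first r left singular
    vectors of the demeaned panel, scaled by sqrt(T)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :r] * np.sqrt(Xc.shape[0])

Fhat = extract_factors(X, r)

# The estimates should span the true factor space: the R^2 of regressing
# the (demeaned) true factors on the estimates is near one.
Fc = F - F.mean(axis=0)
B, *_ = np.linalg.lstsq(Fhat, Fc, rcond=None)
r2 = 1.0 - ((Fc - Fhat @ B) ** 2).sum() / (Fc ** 2).sum()
```

The second stage would then fit a binary-response (probit-type) regression of the recession indicator on `Fhat`, which is where the paper's likelihood theory comes in.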
By: | Atika Aouri; Philipp Otto |
Abstract: | We introduce a heterogeneous spatiotemporal GARCH model for geostatistical data or processes on networks, e.g., for modelling and predicting financial return volatility across firms in a latent spatial framework. The model combines classical GARCH(p, q) dynamics with spatially correlated innovations and spatially varying parameters, estimated using local likelihood methods. Spatial dependence is introduced through a geostatistical covariance structure on the innovation process, capturing contemporaneous cross-sectional correlation. This dependence propagates into the volatility dynamics via the recursive GARCH structure, allowing the model to reflect spatial spillovers and contagion effects in a parsimonious and interpretable way. In addition, this modelling framework allows for spatial volatility predictions at unobserved locations. In an empirical application, we demonstrate how the model can be applied to financial stock networks. Unlike other spatial GARCH models, our framework does not rely on a fixed adjacency matrix; instead, spatial proximity is defined in a proxy space constructed from balance sheet characteristics. Using daily log returns of 50 publicly listed firms over a one-year period, we evaluate the model's predictive performance in a cross-validation study. |
Date: | 2025–08 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2508.20101 |
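A minimal simulation sketch of the mechanism described above, under assumed parameter values: each location follows a GARCH(1,1) recursion with spatially varying `alpha`, while the innovations are drawn with an exponential geostatistical covariance, so contemporaneous spatial correlation propagates into the volatilities. This is an illustration of the model class, not the authors' local-likelihood estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Locations on a line; exponential (geostatistical) innovation covariance
n, T = 5, 500
coords = np.arange(n, dtype=float)
dist = np.abs(coords[:, None] - coords[None, :])
Sigma = np.exp(-dist / 2.0)                 # contemporaneous spatial correlation
Lchol = np.linalg.cholesky(Sigma)

# GARCH(1,1) parameters, allowed to vary over space
omega = np.full(n, 0.05)
alpha = np.linspace(0.05, 0.15, n)
beta = np.full(n, 0.8)

h = omega / (1.0 - alpha - beta)            # start at the unconditional variance
returns = np.empty((T, n))
for t in range(T):
    eps = Lchol @ rng.standard_normal(n)    # spatially correlated shocks
    returns[t] = np.sqrt(h) * eps
    h = omega + alpha * returns[t] ** 2 + beta * h   # volatility recursion
```

Nearby series inherit positive return correlation from the innovation covariance, which is the spillover channel the abstract describes.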
By: | Gaurav Singh |
Abstract: | This study analyzes and forecasts daily passenger counts for New York City's iconic yellow taxis during 2017-2019, a period of significant decline in ridership. Using a comprehensive dataset from the NYC Taxi and Limousine Commission, we employ various time series modeling approaches, including ARIMA models, to predict daily passenger volumes. Our analysis reveals strong seasonal patterns, with a consistent linear decline of approximately 200 passengers per day throughout the study period. After comparing multiple modeling approaches, we find that a first-order autoregressive model, combined with careful detrending and cycle removal, provides the most accurate predictions, achieving a test RMSE of 34,880 passengers on a mean ridership of 438,000 daily passengers. The research provides valuable insights for policymakers and stakeholders in understanding and potentially addressing the declining trajectory of NYC's yellow taxi service. |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2507.10588 |
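The trend-plus-AR(1) structure the abstract lands on can be sketched directly: fit a linear trend by OLS, estimate an AR(1) coefficient on the residual, and forecast as trend extrapolation plus a geometrically decaying residual path. The simulated numbers below only loosely mimic the magnitudes quoted above and are illustrative, not the paper's data.

```python
import numpy as np

def ar1_forecast_with_trend(y, horizon):
    """Fit a linear trend by OLS, model the detrended series as an AR(1),
    and forecast: trend extrapolation plus a mean-reverting residual path."""
    t = np.arange(len(y), dtype=float)
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (intercept + slope * t)
    phi = (resid[1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])  # AR(1) coefficient
    future_t = np.arange(len(y), len(y) + horizon)
    resid_path = resid[-1] * phi ** np.arange(1, horizon + 1)
    return intercept + slope * future_t + resid_path

# Demo: a linear decline of ~200 passengers/day plus persistent noise
rng = np.random.default_rng(2)
noise = np.zeros(300)
for i in range(1, 300):
    noise[i] = 0.7 * noise[i - 1] + rng.standard_normal()
y = 438000.0 - 200.0 * np.arange(300) + 500.0 * noise
forecast = ar1_forecast_with_trend(y, 14)
```

Seasonal cycle removal (omitted here) would be applied to the residual before the AR(1) step in the full procedure the abstract describes.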
By: | Klieber, Karin; Coulombe, Philippe Goulet |
Abstract: | Local projections (LPs) are widely used in empirical macroeconomics to estimate impulse responses to policy interventions. Yet, in many ways, they are black boxes. It is often unclear what mechanism or historical episodes drive a particular estimate. We introduce a new decomposition of LP estimates into the sum of contributions of historical events, which is the product, for each time stamp, of a weight and the realization of the response variable. In the least squares case, we show that these weights admit two interpretations. First, they represent purified and standardized shocks. Second, they serve as proximity scores between the projected policy intervention and past interventions in the sample. Notably, this second interpretation extends naturally to machine learning methods, many of which yield impulse responses that, while nonlinear in predictors, still aggregate past outcomes linearly via proximity-based weights. Applying this framework to shocks in monetary and fiscal policy, global temperature, and the excess bond premium, we find that easily identifiable events—such as Nixon’s interference with the Fed, stagflation, World War II, and the Mount Agung volcanic eruption—emerge as dominant drivers of often heavily concentrated impulse response estimates. JEL Classification: C32, C53, E31, E52, E62 |
Keywords: | climate, financial shocks, fiscal multipliers, local projections, monetary policy |
Date: | 2025–08 |
URL: | https://d.repec.org/n?u=RePEc:ecb:ecbwps:20253105 |
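The least squares decomposition described above can be verified in a few lines: the shock coefficient of an OLS local projection equals a weighted sum of historical outcomes, where the weight vector is the relevant row of (X'X)^{-1}X' and, by construction, has unit inner product with the shock and zero inner product with the controls (the "purified and standardized shock" reading). The data below are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Local projection: regress the h-step-ahead outcome on a shock and a control
T = 120
shock = rng.standard_normal(T)
control = rng.standard_normal(T)
y = 0.8 * shock + 0.3 * control + rng.standard_normal(T)

X = np.column_stack([np.ones(T), shock, control])
beta = np.linalg.solve(X.T @ X, X.T @ y)

# Each coefficient is a weighted sum of historical outcomes:
W = np.linalg.solve(X.T @ X, X.T)   # rows: weights for each coefficient
w_shock = W[1]
contributions = w_shock * y          # per-period contribution to the IRF estimate
```

Sorting `contributions` by absolute value is exactly how one would surface the dominant historical episodes behind an LP estimate.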
By: | Freddy García-Albán; Juan Jarrín
Abstract: | This paper develops a high-frequency economic indicator using a Bayesian Dynamic Factor Model estimated with mixed-frequency data. The model incorporates weekly, monthly, and quarterly official indicators, and allows for dynamic heterogeneity and stochastic volatility. To ensure temporal consistency and avoid irregular aggregation artifacts, we introduce a pseudo-week structure that harmonizes the timing of observations. Our framework integrates dispersed and asynchronous official statistics into a unified High-Frequency Economic Index (HFEI), enabling real-time economic monitoring even in environments characterized by severe data limitations. We apply this framework to construct a high-frequency indicator for Ecuador, a country where official data are sparse and highly asynchronous, and compute pseudo-weekly recession probabilities using a time-varying mean regime-switching model fitted to the resulting index. |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2507.07450 |
By: | Rodrigo Alfaro; Catalina Estefó |
Abstract: | In this paper we propose a framework for building tail-risk indicators for the Chilean Peso (CLP) based on time-variant volatility models [e.g., Engle (1982), Taylor (1982), Nelson (1991), Heston and Nandi (2000)], which we estimate by combining: (i) daily returns, (ii) option-implied volatility (IV), and (iii) intraday realized volatility (RV). Our empirical results show that the in-sample fit of the models improves when volatility measures (IV or RV) are added. We provide an application of the framework to evaluate extreme scenarios. |
Date: | 2025–04 |
URL: | https://d.repec.org/n?u=RePEc:chb:bcchwp:1041 |
By: | Giuseppe Cavaliere; Luca Fanelli; Iliyan Georgiev |
Abstract: | Violation of the assumptions underlying classical (Gaussian) limit theory frequently leads to unreliable statistical inference. This paper shows the novel result that the bootstrap can detect such violation by means of simple and powerful tests which (a) induce no pre-testing bias, (b) can be performed using the same critical values in a broad range of applications, and (c) are consistent against deviations from asymptotic normality. By focusing on the discrepancy between the conditional distribution of a bootstrap statistic and the (limiting) Gaussian distribution which obtains under valid specification, we show how to assess whether this discrepancy is large enough to indicate specification invalidity. The method, which is computationally straightforward, only requires measuring the discrepancy between the bootstrap and the Gaussian distributions based on a sample of i.i.d. draws of the bootstrap statistic. We derive sufficient conditions for the randomness in the data to mix with the randomness in the bootstrap repetitions in a way such that (a), (b) and (c) above hold. To demonstrate the practical relevance and broad applicability of our diagnostic procedure, we discuss five scenarios where the asymptotic Gaussian approximation may fail: (i) weak instruments in instrumental variable regression; (ii) non-stationarity in autoregressive time series; (iii) parameters near or at the boundary of the parameter space; (iv) infinite variance innovations in a location model for i.i.d. data; (v) invalidity of the delta method due to (near-)rank deficiency in the implied Jacobian matrix. An illustration drawn from the empirical macroeconomic literature concludes. |
Date: | 2025–09 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2509.01351 |
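A back-of-the-envelope version of the diagnostic, under assumed choices of statistic and distance: draw i.i.d. bootstrap replications of a studentized mean and compute the Kolmogorov sup-distance to the standard normal CDF. The distance stays small for well-behaved data and blows up under infinite-variance (Cauchy) innovations, scenario (iv) above. This sketch is not the paper's formal test, which comes with critical values and consistency theory.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)

def bootstrap_gaussianity_distance(x, B=2000):
    """Kolmogorov sup-distance between the bootstrap distribution of the
    studentized mean and the standard normal CDF: small when classical
    Gaussian asymptotics hold, large when they fail."""
    n = len(x)
    xbar = x.mean()
    samples = x[rng.integers(0, n, size=(B, n))]        # i.i.d. bootstrap draws
    tstats = np.sqrt(n) * (samples.mean(axis=1) - xbar) / samples.std(axis=1, ddof=1)
    t_sorted = np.sort(tstats)
    Phi = np.array([0.5 * (1.0 + erf(t / sqrt(2.0))) for t in t_sorted])
    ecdf = np.arange(1, B + 1) / B
    return np.abs(ecdf - Phi).max()

d_normal = bootstrap_gaussianity_distance(rng.standard_normal(200))   # well-behaved
d_cauchy = bootstrap_gaussianity_distance(rng.standard_cauchy(200))   # infinite variance
```

The comparison of the two distances mimics how the discrepancy measure separates valid from invalid specifications.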
By: | Susie McKenzie (The Treasury) |
Abstract: | This paper explores the use of vector autoregressive (VAR) models to supplement the New Zealand Treasury’s tax forecasting models. The models are used to forecast both tax revenue and tax receipts. A suite of VAR models is developed for 20 different tax types with a focus on assessing the forecasting performance of six model specifications for each tax category. This paper shows that VAR models exhibit strong predictive performance for tax types with stable trends, such as total tax and source deductions. By contrast, models for corporate tax and other persons tax exhibit higher volatility and larger discrepancies. Several challenges were identified with these models. One challenge is that it is difficult to accommodate changes in tax rates through the sample period. A second challenge is that large shocks, such as the COVID-19 pandemic, introduce significant volatility and affect the accuracy of forecasts, particularly for tax receipts. Some model specifications also exhibit biases in their predictions for certain tax types. Comparing the forecasts to the official data release for 2024Q3, the VAR models for 13 out of 20 tax types produced forecasts within the range of the official tax release, while 7 tax types had discrepancies between $0.7 billion and $3.2 billion, with the largest discrepancies arising in tax receipts forecasts for total, indirect, and GST taxes. |
JEL: | C53 E62 H20 C22 |
Date: | 2025–07–03 |
URL: | https://d.repec.org/n?u=RePEc:nzt:nztans:an25/03 |
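A compact sketch of the workhorse behind such a suite: OLS estimation of a VAR(1) with intercept and iterated point forecasts from the last observation. The bivariate system and parameter values below are simulated assumptions, not the Treasury's tax models.

```python
import numpy as np

def var1_fit_forecast(Y, horizon):
    """OLS estimation of a VAR(1) with intercept, then iterated
    point forecasts starting from the last observation."""
    X = np.column_stack([np.ones(len(Y) - 1), Y[:-1]])
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)   # (1 + k) x k coefficients
    c, A = B[0], B[1:].T
    path, y = [], Y[-1]
    for _ in range(horizon):
        y = c + A @ y
        path.append(y)
    return np.array(path)

# Demo on a simulated stable bivariate VAR(1)
rng = np.random.default_rng(5)
A_true = np.array([[0.5, 0.1], [0.0, 0.4]])
c_true = np.array([1.0, 2.0])
Y = np.zeros((400, 2))
for t in range(1, 400):
    Y[t] = c_true + A_true @ Y[t - 1] + 0.1 * rng.standard_normal(2)
path = var1_fit_forecast(Y, 40)
```

For a stable system the forecast path converges to the estimated long-run mean (I - A)^{-1} c, which is why such models track stable-trend tax types well and struggle after regime changes like rate reforms or COVID-19 shocks.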
By: | Yushi YOSHIDA |
Abstract: | People perceive the same level of the nominal exchange rate as overvalued at one point in time and undervalued at another. To capture the perception of the exchange rate at specific times, we suggest constructing a perceived exchange rate (PER) index by counting newspaper articles containing the phrases 'appreciated currency' or 'depreciated currency.' A shift in the PER index alters the dynamic response of exchange rates in time series. The PER index is a valid threshold variable in forecasting future exchange rates. The forecast model with the PER index as a threshold variable (PER TAR) outperforms models that use lagged exchange rates as the threshold variable. We also show that the forecast precision of the PER TAR model is as good as that of survey forecasts by market participants. |
Date: | 2025–08 |
URL: | https://d.repec.org/n?u=RePEc:eti:dpaper:25079 |
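The threshold autoregression at the heart of the PER TAR model can be sketched generically: fit separate AR(1) regressions on dates where a threshold variable (here a stand-in for the PER index) is above or below a cutoff, then forecast with the regime indicated by the latest threshold value. Data and parameters below are simulated assumptions.

```python
import numpy as np

def tar_forecast(y, z, c):
    """One-step forecast from a two-regime threshold AR(1): separate AR(1)
    fits on dates where the threshold variable z is above or below c,
    forecasting with the regime selected by the most recent z."""
    ylag, ynow, zlag = y[:-1], y[1:], z[:-1]
    coefs = {}
    for name, mask in (("low", zlag <= c), ("high", zlag > c)):
        X = np.column_stack([np.ones(mask.sum()), ylag[mask]])
        coefs[name] = np.linalg.lstsq(X, ynow[mask], rcond=None)[0]
    a, b = coefs["high"] if z[-1] > c else coefs["low"]
    return a + b * y[-1]

# Demo: persistence switches with the sign of the threshold variable
rng = np.random.default_rng(6)
z = rng.standard_normal(300)
y = np.zeros(300)
for t in range(299):
    y[t + 1] = (0.9 if z[t] > 0 else 0.2) * y[t] + 0.1 * rng.standard_normal()
fc = tar_forecast(y, z, 0.0)
```

Replacing `z` with lagged exchange rates gives the benchmark the paper reports the PER TAR model beats.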
By: | Leonardo Bargigli |
Abstract: | Using a novel Generalized Autoregressive Score (GAS) methodology applied to EUR/USD high-frequency interdealer data on price variations and net demand in 2016, this paper provides evidence of a substantial violation of market efficiency in the foreign exchange market. The analysis shows that endogenous factors amplified efficient price fluctuations by at least 46% on average, underscoring the importance of informational asymmetry and feedback trading in exchange rate dynamics. The key implication is that excess volatility of the EUR/USD exchange rate is not only sizeable but also structural, as it arises from mechanisms intrinsic to market functioning. |
Keywords: | excess volatility, foreign exchange, high frequency data, score-driven model, GARCH, SVAR. |
JEL: | G14 C32 C58 F31 |
Date: | 2025 |
URL: | https://d.repec.org/n?u=RePEc:frz:wpaper:wp2025_13.rdf |
By: | Gian Pietro Bellocca; Ignacio Garrón; Vladimir Rodríguez-Caballero; Esther Ruiz
Abstract: | Obtaining realistic scenarios for the distribution of key economic variables is crucial for econometricians, policy-makers, and financial analysts. The FARS package provides a comprehensive framework in R for modeling and designing economic scenarios based on distributions derived from multi-level dynamic factor models (ML-DFMs) and factor-augmented quantile regressions (FA-QRs). The package enables users to: (i) extract global and block-specific factors using a flexible multi-level factor structure; (ii) compute asymptotically valid confidence regions for the estimated factors, accounting for uncertainty in the factor loadings; (iii) estimate FA-QRs; (iv) recover full predictive conditional densities from quantile forecasts; and (v) estimate the conditional density when the factors are stressed. |
Date: | 2025–07 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2507.10679 |
By: | Zhuohang Zhu; Haodong Chen; Qiang Qu; Vera Chung |
Abstract: | Financial time-series forecasting is critical for maintaining economic stability, guiding informed policymaking, and promoting sustainable investment practices. However, it remains challenging due to various underlying pattern shifts. These shifts arise primarily from three sources: temporal non-stationarity (distribution changes over time), multi-domain diversity (distinct patterns across financial domains such as stocks, commodities, and futures), and varying temporal resolutions (patterns differing across per-second, hourly, daily, or weekly indicators). While recent deep learning methods attempt to address these complexities, they frequently suffer from overfitting and typically require extensive domain-specific fine-tuning. To overcome these limitations, we introduce FinCast, the first foundation model specifically designed for financial time-series forecasting, trained on large-scale financial datasets. Remarkably, FinCast exhibits robust zero-shot performance, effectively capturing diverse patterns without domain-specific fine-tuning. Comprehensive empirical and qualitative evaluations demonstrate that FinCast surpasses existing state-of-the-art methods, highlighting its strong generalization capabilities. |
Date: | 2025–08 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2508.19609 |
By: | James W. Taylor; Chao Wang |
Abstract: | Value-at-risk (VaR) and expected shortfall (ES) have become widely used measures of risk for daily portfolio returns. As a result, many methods now exist for forecasting the VaR and ES. These include GARCH-based modelling, approaches involving quantile-based autoregressive models, and methods incorporating measures of realised volatility. When multiple forecasting methods are available, an alternative to method selection is forecast combination. In this paper, we consider the combination of a large pool of VaR and ES forecasts. As there have been few studies in this area, we implement a variety of new combining methods. Among simple methods, in addition to the simple average, the large pool of forecasts leads us to use the median and mode. As a complement to the previously proposed performance-based weighted combinations, we use regularised estimation to limit the risk of overfitting due to the large number of weights. By viewing the forecasts of VaR and ES from each method as the bounds of an interval forecast, we are able to apply interval forecast combining methods from the decision analysis literature. These include different forms of trimmed mean, and a probability averaging method that involves a mixture of the probability distributions inferred from the VaR and ES forecasts. Among other methods, we consider smooth transition between two combining methods. Using six stock indices and a pool of 90 individual forecasting methods, we obtain particularly strong results for a trimmed mean approach, the probability averaging method, and performance-based weighted combining. |
Date: | 2025–08 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2508.16919 |
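The trimmed mean combination that performs well above is simple to state: sort the pool of forecasts, drop a fraction at each end, and average the rest, applied separately to the VaR and ES pools. The example pool below (with one optimistic and one extreme forecaster) is an illustrative assumption.

```python
import numpy as np

def combine_var_es(var_pool, es_pool, trim=0.1):
    """Symmetric trimmed-mean combination of pools of VaR and ES
    forecasts: sort, drop the `trim` fraction at each end, average."""
    def trimmed_mean(v):
        v = np.sort(np.asarray(v, dtype=float))
        k = int(trim * len(v))
        return v[k:len(v) - k].mean()
    return trimmed_mean(var_pool), trimmed_mean(es_pool)

# Example pool: one optimistic and one extreme pessimistic forecaster
var_pool = [-1.0, -2.0, -2.0, -2.0, -2.0, -2.0, -2.0, -2.0, -3.0, -50.0]
es_pool = [1.5 * v for v in var_pool]
var_comb, es_comb = combine_var_es(var_pool, es_pool)
```

Trimming discards the -50.0 outlier that would dominate a simple average, which is why robust combinations shine with large, heterogeneous pools; the median is the limiting case of maximal trimming.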
By: | Cecilia Mancini |
Abstract: | We describe a Matlab routine that allows us to estimate the jumps in financial asset prices using the Threshold (or Truncation) method of Mancini (2009). The routine is designed for application to five-minute log-returns. The underlying assumption is that asset prices evolve in time following an Ito semimartingale with possibly stochastic volatility and jumps. A log-return is likely to contain a jump if its absolute value is larger than a threshold determined by the maximum increment of the Brownian semimartingale part. The latter is particularly sensitive to the magnitude of the volatility coefficient, and from an empirical point of view, volatility levels typically depend on the time of day (TOD), with volatility being highest at the beginning and end of the day, while it is low in the middle. The first routine estimates the TOD effect, implementing the method described in Bollerslev and Todorov (2011). Subsequently, the TOD effect for the stock Apple Inc. (AAPL) is visualized. The second routine presented is an implementation of the threshold method for estimating jumps in AAPL prices. The procedure recursively estimates daily volatility and jumps. In each round, the threshold depends on the time of day and is constructed using the estimate of the daily volatility multiplied by the daytime TOD factor and by the continuity modulus of the Brownian motion paths. Once the jumps are detected, the daily volatility estimate is updated using only the log-returns not containing jumps. Before application to empirical data, the reliability of the procedure was separately tested on simulated asset prices. The results obtained on a record of AAPL stock prices are visualized. |
Date: | 2025–08 |
URL: | https://d.repec.org/n?u=RePEc:arx:papers:2508.18876 |
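The recursive threshold logic translates naturally into Python. The sketch below flags a return as a jump when its absolute value exceeds c·σ·TOD·sqrt(dt) and re-estimates the daily volatility from the non-flagged returns in each round; the paper's routine additionally uses the Brownian continuity modulus sqrt(dt·log(1/dt)) and an estimated (rather than flat) TOD profile, so the constant `c = 4` here is an illustrative assumption.

```python
import numpy as np

def threshold_jumps(r, tod, dt, c=4.0, n_iter=5):
    """Truncation-style jump detection: flag return t as a jump when
    |r_t| > c * sigma_day * tod_t * sqrt(dt), re-estimating the daily
    volatility from the non-flagged returns at each iteration."""
    r = np.asarray(r, dtype=float)
    jumps = np.zeros(len(r), dtype=bool)
    for _ in range(n_iter):
        keep = ~jumps
        sigma2 = (r[keep] ** 2 / (tod[keep] ** 2 * dt)).mean()  # daily variance
        jumps = np.abs(r) > c * np.sqrt(sigma2 * dt) * tod
    return jumps

# Demo: flat TOD profile, unit daily volatility, two injected jumps
rng = np.random.default_rng(7)
dt = 1.0 / 78                       # 78 five-minute returns per trading day
r = rng.normal(0.0, np.sqrt(dt), 780)
r[100] += 0.9
r[400] -= 0.9
flags = threshold_jumps(r, np.ones(780), dt)
```

Because the injected jumps inflate the first-round variance estimate, the recursion matters: each pass removes detected jumps and tightens the threshold toward the diffusive scale.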