New Economics Papers on Econometric Time Series
By: | Mr. Sakai Ando |
Abstract: | How can one make forecasts that (1) satisfy constraints, such as accounting identities, and (2) are smooth over time? Solving this common forecasting problem manually is resource-intensive, yet the existing literature provides little guidance on how to achieve both objectives. This paper proposes a new method to smooth mixed-frequency multivariate time series subject to constraints by integrating minimum-trace reconciliation and the Hodrick-Prescott filter. With linear constraints, the method has a closed-form solution, which is convenient in high-dimensional environments. Three examples show that the proposed method can reproduce the smoothness of professional forecasts subject to various constraints and slightly improve forecast performance. |
Keywords: | Smoothness; Forecast Reconciliation; Minimum Trace Reconciliation; Hodrick-Prescott filter; Cross-sectional; Temporal |
Date: | 2024-03-22 |
URL: | http://d.repec.org/n?u=RePEc:imf:imfwpa:2024/066&r=ets |
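A minimal sketch of the closed-form idea in the entry above: an HP-filter smoothness penalty combined with linear equality constraints is an equality-constrained quadratic program, solvable in one linear solve via its KKT system. The function name, the lambda = 1600 default, and the sum constraint in the example are illustrative assumptions, not the paper's exact estimator, which also integrates minimum-trace reconciliation weights.

```python
import numpy as np

def smooth_with_constraints(y, C, d, lam=1600.0):
    """Minimize ||x - y||^2 + lam * ||D2 @ x||^2  subject to  C @ x = d.

    Illustrative sketch only: an HP-style smoothness penalty with linear
    equality constraints, solved in closed form via the KKT system."""
    T = len(y)
    D2 = np.diff(np.eye(T), n=2, axis=0)    # (T-2) x T second-difference operator
    A = np.eye(T) + lam * (D2.T @ D2)       # curvature of the quadratic objective
    m = C.shape[0]
    # KKT system: [A  C'; C  0] [x; mu] = [y; d]
    K = np.block([[A, C.T], [C, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([y, d]))
    return sol[:T]                          # smoothed path satisfying C @ x = d

# Example: smooth a noisy series while pinning its total, an accounting-style
# identity requiring the smoothed values to sum to the observed total.
rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(40))
C = np.ones((1, 40))                        # constraint: sum of smoothed values
d = np.array([y.sum()])                     # must equal the observed sum
x = smooth_with_constraints(y, C, d)
assert np.allclose(C @ x, d)
```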
By: | Dai, Yongsheng; Wang, Hui; Rafferty, Karen; Spence, Ivor; Quinn, Barry |
Abstract: | Time series anomaly detection plays a critical role in applications ranging from finance to industrial monitoring. Effective models need to capture both the inherent characteristics of time series data and the unique patterns associated with anomalies. While traditional forecasting-based and reconstruction-based approaches have been successful, they tend to struggle with complex and evolving anomalies. For instance, stock market data exhibit complex, ever-changing fluctuation patterns that defy straightforward modelling. In this paper, we propose a novel approach called TDSRL (Time Series Dual Self-Supervised Representation Learning) for robust anomaly detection. TDSRL leverages synthetic anomaly segments that are artificially generated to simulate real-world anomalies. The key innovation lies in dual self-supervised pretext tasks: one task characterises anomalies in relation to the entire time series, while the other focuses on local anomaly boundaries. Additionally, we introduce a data degradation method that operates in both the time and frequency domains, creating a more natural simulation of real-world anomalies than purely synthetic data. Consequently, TDSRL is expected to achieve more accurate predictions of the location and extent of anomalous segments. Our experiments demonstrate that TDSRL outperforms state-of-the-art methods, making it a promising avenue for time series anomaly detection. |
Keywords: | Time series anomaly detection, self-supervised representation learning, contrastive learning, synthetic anomaly |
Date: | 2024 |
URL: | http://d.repec.org/n?u=RePEc:zbw:qmsrps:202403&r=ets |
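The two ingredients of the entry above that are easiest to sketch are synthetic anomaly injection and time/frequency-domain degradation. The sketch below is a hedged illustration under assumed forms (a noisy injected burst in the time domain, band attenuation in the Fourier domain); the function names and parameters are hypothetical, not TDSRL's actual implementation.

```python
import numpy as np

def inject_anomaly_segment(x, seg_len=20, scale=3.0, rng=None):
    """Insert a synthetic anomalous segment (a noisy burst) at a random
    position; returns the corrupted series and point-wise anomaly labels.
    Hypothetical stand-in for the paper's synthetic anomaly generation."""
    rng = rng if rng is not None else np.random.default_rng()
    x = x.copy()
    start = int(rng.integers(0, len(x) - seg_len))
    x[start:start + seg_len] += scale * x.std() * rng.standard_normal(seg_len)
    labels = np.zeros(len(x))
    labels[start:start + seg_len] = 1.0
    return x, labels

def degrade_in_frequency_domain(x, band=(5, 15), factor=0.1):
    """Attenuate a band of Fourier coefficients: a frequency-domain
    degradation in the spirit of the abstract's description."""
    X = np.fft.rfft(x)
    lo, hi = band
    X[lo:hi] *= factor
    return np.fft.irfft(X, n=len(x))

# Build a (series, labels) self-supervised training pair from a clean series.
rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.standard_normal(256)
corrupted, labels = inject_anomaly_segment(clean, rng=rng)
corrupted = degrade_in_frequency_domain(corrupted)
```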
By: | Mikkel Bennedsen; Kim Christensen; Peter Christensen |
Abstract: | We develop a framework for composite likelihood inference for parametric continuous-time stationary Gaussian processes. We derive the asymptotic theory of the associated maximum composite likelihood estimator. We implement our approach on a pair of models that have been proposed to describe the random log-spot variance of financial asset returns. A simulation study shows that it delivers good performance in these settings and improves upon method-of-moments estimation. In an application, we inspect the dynamics of an intraday measure of spot variance computed with high-frequency data from the cryptocurrency market. The empirical evidence supports a mechanism in which the short- and long-term correlation structures of stochastic volatility are decoupled in order to capture its properties at different time scales. |
Date: | 2024-03 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2403.12653&r=ets |
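The pairwise composite likelihood idea in the entry above can be sketched for the simplest continuous-time stationary Gaussian process, an Ornstein-Uhlenbeck model with autocovariance c(h) = sigma^2 * exp(-kappa * |h|). This is a generic illustration of maximum composite likelihood under an assumed model, not the paper's specific log-variance specifications; the function and parameter names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def pairwise_composite_nll(theta, x, times, max_lag=5):
    """Negative pairwise composite log-likelihood for a zero-mean stationary
    OU process with autocovariance sigma2 * exp(-kappa * |h|): the sum of
    bivariate Gaussian log densities over all pairs up to max_lag apart."""
    kappa, sigma2 = np.exp(theta)          # log-parameterization keeps both positive
    nll = 0.0
    for k in range(1, max_lag + 1):
        h = times[k:] - times[:-k]
        rho = np.exp(-kappa * h)           # correlation at time separation h
        det = sigma2**2 * (1.0 - rho**2)   # determinant of the 2x2 covariance
        quad = (x[:-k]**2 - 2*rho*x[:-k]*x[k:] + x[k:]**2) / (sigma2 * (1 - rho**2))
        nll += np.sum(np.log(2*np.pi) + 0.5*np.log(det) + 0.5*quad)
    return nll

# Simulate an OU path with the exact transition and recover (kappa, sigma2).
rng = np.random.default_rng(2)
times = np.arange(0.0, 500.0, 0.5)
kappa_true, sigma2_true = 0.8, 2.0
x = np.empty(len(times))
x[0] = rng.normal(0.0, np.sqrt(sigma2_true))
for i in range(1, len(times)):
    a = np.exp(-kappa_true * (times[i] - times[i-1]))   # exact OU autoregression
    x[i] = a * x[i-1] + rng.normal(0.0, np.sqrt(sigma2_true * (1 - a**2)))
res = minimize(pairwise_composite_nll, np.log([1.0, 1.0]), args=(x, times))
kappa_hat, sigma2_hat = np.exp(res.x)
```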
By: | Babel Raïssa Guemdjo Kamdem (Advanced School of Economics and Commerce, University of Douala); Jules Sadefo Kamdem (MRM - Montpellier Research in Management - UPVD - Université de Perpignan Via Domitia - UM - Université de Montpellier); Carlos Ogouyandjou (IMSP - Institut de Mathématiques et de Sciences Physiques - UAC - Université d’Abomey-Calavi = University of Abomey Calavi) |
Abstract: | An extended interval is a range A = [a, b] whose left endpoint a may be bigger than its right endpoint b. This is not really natural, but it is how extended intervals have been defined so far. In the present work we introduce a new, natural, and very intuitive way to see an extended interval. From now on, an extended interval is a subset of the Cartesian product R × Z2, where Z2 = {0, 1} is the set of directions; direction 0 is for increasing intervals, and direction 1 for decreasing ones. For instance, [3, 6] × {1} is the decreasing version of [6, 3]. Thereafter, we introduce on the set of extended intervals a family of metrics dγ, depending on a function γ(t), and show that there exists a unique metric dγ for which γ(t)dt is what we have called an "adapted measure". This unique metric has very good properties, is simple to compute, and has been implemented in the software R. Furthermore, we use this metric to define variability for random extended intervals. We further study extended interval-valued ARMA time series and prove the Wold decomposition theorem for stationary extended interval-valued time series. |
Keywords: | Random set, Random extended interval, Distance, Measure, Time series |
Date: | 2024-03-13 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-04506343&r=ets |
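To make the R × Z2 representation in the entry above concrete, the sketch below encodes an extended interval as (endpoint, endpoint, direction) and computes a γ-weighted L2-type distance between the paths the two intervals trace out. The traversal parameterization and the specific distance formula are illustrative assumptions, not the paper's exact dγ or its adapted-measure characterization (the authors' R implementation is the reference).

```python
import numpy as np

def traversal(iv, t):
    """Path traced by an extended interval (a, b, direction) over t in [0, 1]:
    direction 0 traverses a -> b (increasing), direction 1 traverses b -> a."""
    a, b, direction = iv
    start, end = (a, b) if direction == 0 else (b, a)
    return (1.0 - t) * start + t * end

def gamma_distance(A, B, gamma=lambda t: np.ones_like(t), n=1000):
    """Illustrative gamma-weighted distance between extended intervals:
    sqrt( integral_0^1 gamma(t) * (A(t) - B(t))^2 dt ), approximated on a
    uniform grid. Hypothetical form, not the paper's exact d_gamma."""
    t = np.linspace(0.0, 1.0, n)
    diff2 = (traversal(A, t) - traversal(B, t))**2
    return float(np.sqrt(np.mean(gamma(t) * diff2)))  # grid mean ~ integral on [0, 1]

# [3, 6] x {1} (decreasing, traversed 6 -> 3) vs the increasing [3, 6] x {0}:
print(gamma_distance((3.0, 6.0, 1), (3.0, 6.0, 0)))   # positive: opposite directions
print(gamma_distance((3.0, 6.0, 1), (3.0, 6.0, 1)))   # 0.0: identical intervals
```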