Repetitive Contrastive Learning Enhances Mamba's Selectivity in Time Series Prediction

arXiv — cs.LG · Thursday, November 13, 2025
The introduction of Repetitive Contrastive Learning (RCL) marks a significant advance in time series forecasting, particularly for Mamba-based models, which have excelled thanks to their sequence-selection capabilities. These models, however, have struggled to focus on critical time steps and to suppress noise. RCL addresses both issues by pretraining a Mamba block to strengthen its selective abilities, which are then transferred to various backbone models. The approach augments sequences with Gaussian noise and applies both inter-sequence and intra-sequence contrastive learning to prioritize information-rich time steps. Extensive experiments show that RCL improves the temporal prediction performance of these models and surpasses existing methods, achieving state-of-the-art results. The authors also propose two new metrics to quantify Mamba's selective capabilities, further solidifying the impact of RCL …
— via World Pulse Now AI Editorial System
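The blurb describes the two core ingredients of RCL, Gaussian-noise sequence augmentation and a contrastive objective over positive and negative views, without giving code. The sketch below is a minimal, hypothetical illustration of that general recipe, not the paper's actual method: `augment` and `info_nce` are names chosen here, the embedding step is omitted, and all hyperparameters (`sigma`, `tau`) are assumptions.

```python
import numpy as np


def augment(seq, sigma=0.1, rng=None):
    # Hypothetical augmentation: add i.i.d. Gaussian noise to a (T, d) series,
    # producing an alternate "view" of the same underlying sequence.
    rng = rng or np.random.default_rng(0)
    return seq + rng.normal(0.0, sigma, size=seq.shape)


def info_nce(anchor, positive, negatives, tau=0.1):
    # Generic InfoNCE-style contrastive loss: pull the anchor embedding
    # toward its positive view and push it away from negative views.
    # All inputs are 1-D embedding vectors; tau is a temperature.
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    logits = np.array([cos(anchor, positive)]
                      + [cos(anchor, n) for n in negatives]) / tau
    logits -= logits.max()  # numerical stability before softmax
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])  # positive pair sits at index 0


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    seq = rng.normal(size=(16, 4))          # one toy (T=16, d=4) series
    view_a = augment(seq, rng=rng).ravel()  # two noisy views of the same
    view_b = augment(seq, rng=rng).ravel()  # sequence form a positive pair
    other = rng.normal(size=(16, 4)).ravel()  # a different series: negative
    loss = info_nce(view_a, view_b, [other])
    print(f"contrastive loss: {loss:.4f}")
```

In this toy setup, two noisy views of the same series act as the positive pair (the inter-sequence contrast named in the summary), while unrelated series supply negatives; the intra-sequence variant would instead contrast individual time steps within one series, which this sketch does not attempt.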
