Continuous Subspace Optimization for Continual Learning

arXiv — cs.LG · Wednesday, November 12, 2025 at 5:00:00 AM
The publication of Continuous Subspace Optimization for Continual Learning (CoSO) introduces a novel approach to catastrophic forgetting, the tendency of continually trained models to lose knowledge of previous tasks as they learn new ones. Traditional methods often rely on low-rank adaptation, which confines parameter updates to a fixed low-rank subspace and can limit the capacity to learn new tasks. CoSO instead optimizes the model in a sequence of subspaces determined through singular value decomposition of the gradients, which keeps the method memory-efficient while allowing richer updates. To mitigate forgetting, the optimization subspace for each task is constrained to be orthogonal to the subspaces of previous tasks. The significance of CoSO lies in its potential to improve the adaptability of machine learning models in real-world applications where sequential learning is essential.
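As a rough illustration of the mechanism described above, the NumPy sketch below derives a task's update subspace from the SVD of a gradient matrix and projects out the bases of earlier tasks to enforce orthogonality. All names here (task_subspace, project_update, prev_bases, rank) are hypothetical, chosen for this sketch; they are not the paper's actual implementation or API.

```python
# Minimal sketch, assuming a per-layer weight gradient `grad` of shape (d, m).
import numpy as np

def task_subspace(grad: np.ndarray, prev_bases: list[np.ndarray], rank: int) -> np.ndarray:
    """Derive a rank-`rank` update subspace from a gradient matrix,
    orthogonal to the subspaces used by previous tasks."""
    g = grad.copy()
    # Remove directions spanned by earlier tasks so the new subspace
    # cannot interfere with previously consolidated knowledge.
    for B in prev_bases:              # each B: (d, r) with orthonormal columns
        g -= B @ (B.T @ g)
    # SVD of the deflated gradient; the top left singular vectors span
    # the most informative remaining update directions.
    U, _, _ = np.linalg.svd(g, full_matrices=False)
    return U[:, :rank]                # (d, rank) orthonormal basis

def project_update(grad: np.ndarray, basis: np.ndarray) -> np.ndarray:
    """Constrain a gradient step to the current task's subspace."""
    return basis @ (basis.T @ grad)

# Toy usage: two sequential "tasks" updating a 64x32 weight matrix.
rng = np.random.default_rng(0)
prev_bases: list[np.ndarray] = []
for task in range(2):
    grad = rng.standard_normal((64, 32))
    basis = task_subspace(grad, prev_bases, rank=4)
    step = project_update(grad, basis)   # update confined to this task's subspace
    prev_bases.append(basis)             # freeze the subspace once the task ends
```

Because each new basis is computed from a gradient that has already been deflated against every stored basis, updates for the current task lie in a subspace orthogonal to all previous ones, which is the property the summary credits with mitigating forgetting.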
— via World Pulse Now AI Editorial System
