S$^2$M-Former: Spiking Symmetric Mixing Branchformer for Brain Auditory Attention Detection

arXiv — cs.LG · Wednesday, November 12, 2025, 5:00:00 AM
The S$^2$M-Former represents a significant advance in auditory attention detection (AAD), a field crucial for developing neuro-steered hearing devices. By leveraging a spiking symmetric architecture with parallel spatial and frequency branches, it enhances complementary learning of EEG features. Lightweight 1D token sequences yield a 14.7-fold reduction in parameters, while the brain-inspired spiking design cuts energy consumption 5.8-fold compared to recent artificial neural network (ANN) methods. Comprehensive experiments on three AAD benchmarks show that S$^2$M-Former not only excels in energy efficiency but also achieves decoding accuracy comparable to the state of the art, marking a pivotal step forward for EEG technology in complex auditory environments.
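To make the two-branch idea concrete, here is a minimal, hypothetical sketch (not the authors' code) of a symmetric mixing block over 1D EEG token features: one branch mixes across EEG channels (spatial), the other across frequency bins, each passed through a simple leaky integrate-and-fire (LIF) spiking nonlinearity before fusion. All shapes, weights, and the LIF parameters are illustrative assumptions.

```python
# Hypothetical sketch of spiking symmetric mixing -- NOT the S$^2$M-Former
# implementation; layout, weights, and LIF parameters are assumptions.
import numpy as np

def lif_spikes(x, threshold=1.0, decay=0.5, steps=4):
    """Rate-code input via a leaky integrate-and-fire neuron over `steps` timesteps."""
    mem = np.zeros_like(x)
    spikes = np.zeros_like(x)
    for _ in range(steps):
        mem = decay * mem + x                          # leaky integration
        fired = (mem >= threshold).astype(x.dtype)
        spikes += fired
        mem = np.where(fired > 0, mem - threshold, mem)  # soft reset on spike
    return spikes / steps                              # mean firing rate in [0, 1]

def symmetric_mixing_block(tokens, w_spatial, w_freq):
    """
    tokens: (channels, freq_bins) array of 1D EEG token features (assumed layout).
    The spatial branch mixes across channels; the frequency branch mixes
    across bins; their spiking outputs are fused additively.
    """
    spatial = lif_spikes(w_spatial @ tokens)   # mix along the channel axis
    freq = lif_spikes(tokens @ w_freq)         # mix along the frequency axis
    return spatial + freq                      # complementary fusion

rng = np.random.default_rng(0)
C, F = 64, 32                                  # e.g. 64 EEG channels, 32 freq bins
tokens = rng.standard_normal((C, F))
out = symmetric_mixing_block(tokens,
                             rng.standard_normal((C, C)) * 0.1,
                             rng.standard_normal((F, F)) * 0.1)
print(out.shape)
```

Because each branch emits bounded firing rates rather than dense activations, this style of block is what enables the energy savings the article attributes to spiking designs.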
— via World Pulse Now AI Editorial System
