S$^2$M-Former: Spiking Symmetric Mixing Branchformer for Brain Auditory Attention Detection
Positive · Artificial Intelligence
The S$^2$M-Former marks a notable advance in auditory attention detection (AAD), a task central to developing neuro-steered hearing devices. Its spiking symmetric architecture uses parallel spatial and frequency branches to learn complementary EEG features. Lightweight 1D token sequences cut the parameter count by 14.7 times, and the brain-inspired spiking design lowers energy consumption by 5.8 times compared to recent artificial neural network (ANN) methods. Comprehensive experiments on three AAD benchmarks show that S$^2$M-Former matches state-of-the-art decoding accuracy while being far more energy-efficient, a meaningful step toward applying EEG technology in complex auditory environments.
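To make the parallel-branch idea concrete, here is a minimal sketch of two branches (spatial and frequency) that each project 1D token sequences, emit binary spikes via a thresholded leaky-integrate-and-fire step, and are then fused symmetrically. All shapes, weights, and the element-wise averaging used for mixing are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def lif_spikes(membrane, threshold=1.0):
    """Thresholded spike emission, standing in for an LIF neuron's firing step."""
    return (membrane >= threshold).astype(np.float32)

def branch(tokens, weight):
    """One branch: linear projection of a 1D token sequence, then spiking."""
    return lif_spikes(tokens @ weight)

rng = np.random.default_rng(0)
T, C, F = 8, 16, 16                        # tokens, spatial dim, freq dim (assumed)
spatial_tokens = rng.normal(size=(T, C))   # e.g. EEG channel-wise tokens
freq_tokens = rng.normal(size=(T, F))      # e.g. spectral tokens

W_s = rng.normal(scale=0.5, size=(C, C))   # hypothetical branch weights
W_f = rng.normal(scale=0.5, size=(F, C))

# Parallel branches yield binary spike maps of the same shape,
# which a symmetric mix (here: element-wise average) can fuse.
s = branch(spatial_tokens, W_s)
f = branch(freq_tokens, W_f)
fused = 0.5 * (s + f)

print(fused.shape)  # (8, 16)
```

Because each branch outputs only 0/1 spikes, the fused map takes values in {0, 0.5, 1}; sparse binary activity of this kind is what makes spiking designs cheap to compute relative to dense ANN activations.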
— via World Pulse Now AI Editorial System
