Stabilizing Direct Training of Spiking Neural Networks: Membrane Potential Initialization and Threshold-robust Surrogate Gradient

arXiv — cs.CV · Thursday, November 13, 2025 at 5:00:00 AM
A recent paper on stabilizing the direct training of Spiking Neural Networks (SNNs) presents two key innovations: Membrane Potential Initialization (MP-Init) and Threshold-robust Surrogate Gradient (TrSG). These techniques tackle two persistent obstacles to effective SNN training: temporal covariate shift (TCS) and unstable gradient flow. MP-Init mitigates TCS by aligning the initial membrane potential with its stationary distribution, while TrSG stabilizes gradient flow with respect to the neuron firing threshold. Extensive experiments validate both methods, demonstrating state-of-the-art accuracy on static and dynamic image datasets. Beyond raising SNN performance, the work supports the broader pursuit of energy-efficient AI.
— via World Pulse Now AI Editorial System
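
To make the two ideas concrete, here is a minimal PyTorch-style sketch; it is not the paper's implementation. The names (TrSGSpike, lif_step, mp_init_stationary), the rectangular surrogate, and the warm-up estimate of the stationary potential are all illustrative assumptions: TrSG is rendered as a surrogate gradient normalized by the firing threshold, and MP-Init as reusing the potential reached after a short warm-up run as the t=0 state.

```python
import torch

V_TH = 1.0   # firing threshold (assumed value)
DECAY = 0.5  # membrane decay constant (assumed value)

class TrSGSpike(torch.autograd.Function):
    """Heaviside spike with a threshold-normalized surrogate gradient.

    Normalizing the surrogate by the threshold is one way to keep gradient
    magnitudes stable when thresholds vary; the paper's exact TrSG form
    may differ.
    """

    @staticmethod
    def forward(ctx, v, threshold):
        ctx.save_for_backward(v, threshold)
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        v, threshold = ctx.saved_tensors
        # Rectangular surrogate on the threshold-normalized potential.
        x = (v - threshold) / threshold
        grad_v = grad_output * (x.abs() < 0.5).float() / threshold
        return grad_v, None

def lif_step(v, x, threshold=V_TH, decay=DECAY):
    """One leaky integrate-and-fire step with soft reset."""
    v = decay * v + x
    spikes = TrSGSpike.apply(v, torch.as_tensor(threshold))
    v = v - spikes * threshold
    return v, spikes

def mp_init_stationary(inputs, warmup_steps=20):
    """Estimate a stationary initial potential via a short warm-up run.

    Reusing the potential reached after a few steps as the t=0 state is one
    plausible reading of "aligning the initial membrane potential with its
    stationary distribution"; the paper may derive it in closed form instead.
    """
    v = torch.zeros_like(inputs[0])
    with torch.no_grad():
        for t in range(min(warmup_steps, inputs.shape[0])):
            v, _ = lif_step(v, inputs[t])
    return v

# Usage: initialize the potential, then unroll over time as usual.
inputs = torch.randn(50, 8, 128)   # (time, batch, features)
v = mp_init_stationary(inputs)     # MP-Init instead of v = 0
outputs = []
for x in inputs:
    v, s = lif_step(v, x)
    outputs.append(s)
```

The point of dividing the surrogate by the threshold is that the backward signal no longer blows up or vanishes when thresholds are small or large, which is what "threshold-robust" gestures at.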

Continue Reading
Sleep-Based Homeostatic Regularization for Stabilizing Spike-Timing-Dependent Plasticity in Recurrent Spiking Neural Networks
Neutral · Artificial Intelligence
A new study proposes a sleep-based homeostatic regularization scheme to stabilize spike-timing-dependent plasticity (STDP) in recurrent spiking neural networks (SNNs). This approach aims to mitigate issues such as unbounded weight growth and catastrophic forgetting by introducing offline phases where synaptic weights decay towards a homeostatic baseline, enhancing memory consolidation.
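
As a rough illustration of the idea (the function name, parameters, and the exponential-relaxation form are assumptions, not details from the study), an offline "sleep" phase could decay weights toward a homeostatic baseline like this:

```python
import numpy as np

def sleep_phase(weights, baseline=0.0, rate=0.05, steps=100):
    """Offline homeostatic relaxation.

    Each step moves the synaptic weights a fraction `rate` of the remaining
    distance toward `baseline`, i.e. exponential decay
    w <- w + rate * (baseline - w). Pulling weights back this way counters
    the unbounded growth that raw STDP can produce.
    """
    w = np.asarray(weights, dtype=float).copy()
    for _ in range(steps):
        w += rate * (baseline - w)
    return w

# Usage: interleave wake-phase STDP updates with sleep-phase relaxation.
w = np.random.uniform(0.0, 2.0, size=(64, 64))  # recurrent weight matrix
w = sleep_phase(w, baseline=0.5, rate=0.05, steps=50)
```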
