N-ReLU: Zero-Mean Stochastic Extension of ReLU

arXiv — cs.LG · Wednesday, November 12, 2025 at 5:00:00 AM
N-ReLU is a newly introduced activation function that addresses the dead-neuron problem of standard ReLU. By replacing negative activations with zero-mean Gaussian noise, it preserves the expected output while restoring gradient flow in inactive regions. Tested on MNIST with both MLP and CNN architectures, N-ReLU matches or slightly exceeds the accuracy of LeakyReLU and GELU at moderate noise levels, converges stably, and avoids dead neurons. It thus serves as a lightweight mechanism for improving optimization robustness without altering network structures or adding parameters, offering a straightforward tool for training deep learning models more reliably.
— via World Pulse Now AI Editorial System
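
To illustrate the mechanism described above, here is a minimal PyTorch sketch of an N-ReLU-style activation: positive inputs pass through unchanged, while negative inputs are replaced by zero-mean Gaussian noise, so the expected output in the inactive region remains zero but neurons are never permanently silenced. The noise scale `sigma`, the class name `NReLU`, and the choice to fall back to plain ReLU at evaluation time are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class NReLU(nn.Module):
    """Sketch of an N-ReLU-style activation (assumptions noted above).

    During training, negative pre-activations are replaced with zero-mean
    Gaussian noise; positive values pass through unchanged. At evaluation
    time, this sketch falls back to deterministic ReLU.
    """

    def __init__(self, sigma: float = 0.1):
        super().__init__()
        self.sigma = sigma  # noise scale; assumed value for illustration

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Zero-mean Gaussian noise keeps the expected output at zero
            # in the inactive region while preserving some signal there.
            noise = self.sigma * torch.randn_like(x)
            return torch.where(x > 0, x, noise)
        return torch.relu(x)


# Usage example: drop-in replacement for nn.ReLU in a small MNIST MLP.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(784, 256),
    NReLU(sigma=0.1),
    nn.Linear(256, 10),
)
```

Because the activation adds no learnable parameters and leaves the network structure untouched, it can be swapped in wherever ReLU is used.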
