Enhancing DPSGD via Per-Sample Momentum and Low-Pass Filtering

arXiv — cs.LG · Thursday, November 13, 2025, 5:00:00 AM
The paper 'Enhancing DPSGD via Per-Sample Momentum and Low-Pass Filtering', recently submitted to arXiv, presents a new approach to improving Differentially Private Stochastic Gradient Descent (DPSGD), a method widely used to train deep neural networks under formal privacy guarantees. Standard DPSGD implementations often lose accuracy because of the bias and noise they introduce. The proposed DP-PMLF method mitigates these issues by combining per-sample momentum with a low-pass filtering strategy, which together smooth the gradient estimates and reduce sampling variance. The paper's theoretical analysis indicates an improved convergence rate while maintaining rigorous differential privacy guarantees, and its empirical evaluations demonstrate that DP-PMLF achieves a significantly better privacy-utility trade-off than existing state-of-the-art DPSGD variants. This advancement is important for the ongoing development of privacy-preserving machine learning.
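To make the idea concrete, here is a minimal sketch of one plausible reading of the approach: each sample keeps its own momentum buffer that smooths its gradient before DPSGD-style clipping, and an exponential low-pass filter damps the variance of the noisy aggregated update. All hyperparameters (`beta`, `alpha`, `C`, `sigma`, `lr`) and the toy regression task are illustrative assumptions, not values or details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: linear regression with parameter vector w.
n, d = 64, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

# Hypothetical hyperparameters (not taken from the paper).
beta = 0.9    # per-sample momentum coefficient
alpha = 0.7   # low-pass filter coefficient on the aggregated update
C = 1.0       # per-sample clipping norm
sigma = 0.5   # Gaussian noise multiplier
lr = 0.1

w = np.zeros(d)
m = np.zeros((n, d))   # one momentum buffer per training sample
u = np.zeros(d)        # low-pass-filtered update direction

for step in range(200):
    batch = rng.choice(n, size=16, replace=False)
    # Per-sample gradients of the squared loss 0.5 * (x_i @ w - y_i)^2.
    residuals = X[batch] @ w - y[batch]
    grads = residuals[:, None] * X[batch]

    # Per-sample momentum: smooth each sample's gradient across steps.
    m[batch] = beta * m[batch] + (1 - beta) * grads

    # DPSGD-style step: clip each momentum vector, sum, add Gaussian noise.
    norms = np.linalg.norm(m[batch], axis=1, keepdims=True)
    clipped = m[batch] / np.maximum(1.0, norms / C)
    noisy = (clipped.sum(axis=0) + sigma * C * rng.normal(size=d)) / len(batch)

    # Low-pass filter the noisy aggregate to reduce sampling variance.
    u = alpha * u + (1 - alpha) * noisy
    w -= lr * u

print(float(np.mean((X @ w - y) ** 2)))
```

Clipping the momentum buffers rather than the raw gradients is the key difference from vanilla DPSGD in this sketch; the filter coefficient `alpha` trades responsiveness against noise suppression.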
— via World Pulse Now AI Editorial System


Continue Reading
Towards A Unified PAC-Bayesian Framework for Norm-based Generalization Bounds
Neutral · Artificial Intelligence
A new study proposes a unified PAC-Bayesian framework for norm-based generalization bounds, addressing the challenges of understanding deep neural networks' generalization behavior. The research reformulates the derivation of these bounds as a stochastic optimization problem over anisotropic Gaussian posteriors, aiming to enhance the practical relevance of the results.
A Statistical Assessment of Amortized Inference Under Signal-to-Noise Variation and Distribution Shift
Neutral · Artificial Intelligence
A recent study has assessed the effectiveness of amortized inference in Bayesian statistics, particularly under varying signal-to-noise ratios and distribution shifts. This method leverages deep neural networks to streamline the inference process, allowing for significant computational savings compared to traditional Bayesian approaches that require extensive likelihood evaluations.
