FedPM: Federated Learning Using Second-order Optimization with Preconditioned Mixing of Local Parameters

arXiv — cs.LG · Thursday, November 13, 2025 at 5:00:00 AM
The introduction of Federated Preconditioned Mixing (FedPM) marks a significant step for Federated Learning (FL): it addresses a weakness of prior second-order methods such as LocalNewton, LTDA, and FedSophia, whose local preconditioners drift and disrupt convergence. FedPM refines the local update rules and performs preconditioned mixing of the local parameters on the server, which mitigates this drift and improves test accuracy. A theoretical convergence analysis establishes a superlinear rate for strongly convex objectives, underscoring the method's potential in heterogeneous data settings, and extensive experiments confirm substantial performance gains over conventional methods. The result is more reliable and efficient FL, which matters for applications where data privacy and decentralized learning are paramount.
— via World Pulse Now AI Editorial System
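
The core server-side step described above is a preconditioned mix of the clients' local parameters. As a rough illustration only, the sketch below assumes each client uploads its local parameter vector together with a symmetric positive-definite curvature estimate (a local preconditioner), and the server solves for the model implied by the combined preconditioners; the function name, the weighting scheme, and the use of explicit dense matrices are assumptions made for this sketch, not the paper's actual update rule.

```python
import numpy as np

def preconditioned_mixing(local_params, local_preconds, weights=None):
    """Mix local parameter vectors using client-side preconditioners.

    local_params:   list of 1-D arrays, one parameter vector per client
    local_preconds: list of symmetric positive-definite matrices
                    (e.g. damped local curvature estimates), one per client
    weights:        optional per-client weights (e.g. local sample counts)
    """
    n_clients = len(local_params)
    if weights is None:
        weights = np.ones(n_clients) / n_clients

    d = local_params[0].shape[0]
    weighted_precond_sum = np.zeros((d, d))
    weighted_direction = np.zeros(d)

    for w, theta_k, P_k in zip(weights, local_params, local_preconds):
        weighted_precond_sum += w * P_k
        weighted_direction += w * (P_k @ theta_k)

    # Solve (sum_k w_k P_k) theta = sum_k w_k P_k theta_k for the global model.
    return np.linalg.solve(weighted_precond_sum, weighted_direction)


# Toy usage: two clients, 3-dimensional model (hypothetical data).
rng = np.random.default_rng(0)
thetas = [rng.normal(size=3) for _ in range(2)]
preconds = []
for _ in range(2):
    A = rng.normal(size=(3, 3))
    preconds.append(A @ A.T + 0.1 * np.eye(3))  # SPD curvature estimate
print(preconditioned_mixing(thetas, preconds))
```

A useful sanity check on this kind of mixing rule: with identity preconditioners it reduces to a plain weighted average of the client parameters, i.e. FedAvg-style aggregation.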

Continue Reading
Accelerated Methods with Complexity Separation Under Data Similarity for Federated Learning Problems
Artificial Intelligence
A recent study formalizes the challenge of heterogeneous data distributions in federated learning as an optimization problem, proposes several communication-efficient methods together with an optimal algorithm for the convex case, and validates the theory experimentally on a range of problems.
