FedPM: Federated Learning Using Second-order Optimization with Preconditioned Mixing of Local Parameters
The introduction of Federated Preconditioned Mixing (FedPM) marks a notable advance in Federated Learning (FL). Prior second-order methods such as LocalNewton, LTDA, and FedSophia suffered from drift in their local preconditioners, which disrupted convergence. FedPM mitigates this by refining the local update rules and performing preconditioned mixing of the local parameters on the server, yielding improved test accuracy. Its convergence analysis establishes a superlinear rate for strongly convex objectives, and extensive experiments confirm substantial gains over conventional methods in heterogeneous data settings. These properties make FL more reliable and efficient in applications where data privacy and decentralized learning are paramount.
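
As a rough illustration of what server-side preconditioned mixing can look like, the sketch below averages each client's parameter vector weighted by its local curvature (preconditioner) matrix. The function name, the specific weighting scheme, and the use of plain NumPy are assumptions made for illustration; the article does not spell out FedPM's exact update rule.

```python
import numpy as np

def preconditioned_mixing(local_params, local_preconditioners):
    """Illustrative server-side mixing step (assumed form, not the paper's exact rule).

    local_params: list of 1-D arrays theta_k, one per client
    local_preconditioners: list of symmetric positive-definite matrices H_k
    Returns theta = (sum_k H_k)^{-1} (sum_k H_k @ theta_k),
    i.e. clients with sharper curvature pull the mixed parameters toward them.
    """
    H_sum = sum(local_preconditioners)                                   # aggregate curvature
    weighted = sum(H @ th for H, th in zip(local_preconditioners, local_params))
    return np.linalg.solve(H_sum, weighted)                              # preconditioned average

# Toy usage: two clients with different parameters and curvature estimates.
rng = np.random.default_rng(0)
d = 4
thetas = [rng.normal(size=d) for _ in range(2)]
precs = []
for _ in range(2):
    A = rng.normal(size=(d, d))
    precs.append(A @ A.T + np.eye(d))                                    # SPD preconditioner
print(preconditioned_mixing(thetas, precs))
```

Note that plain FedAvg corresponds to the special case where every preconditioner is (a multiple of) the identity; the mixing above instead lets each client's curvature information decide how strongly its parameters count in the aggregate.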
— via World Pulse Now AI Editorial System
