FedSDWC: Federated Synergistic Dual-Representation Weak Causal Learning for OOD
Artificial Intelligence
FedSDWC is a causal inference method proposed to address a central challenge in federated learning (FL): differences in data distribution across clients, which undermine reliability. By jointly modeling both invariant and variant features, FedSDWC captures causal representations and strengthens FL's generalization capabilities. In extensive experiments, it outperforms existing methods such as FedICON by notable margins on benchmark datasets including CIFAR-10 and CIFAR-100. On the theoretical side, the authors derive a generalization error bound under specific conditions and relate it to client prior distributions. This advance is significant because it improves FL performance while also addressing critical concerns about data privacy and the reliability of distributed learning systems in real-world applications.
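The summary above does not specify how FedSDWC separates invariant from variant features, so the following is only an illustrative sketch of the general idea, not the paper's algorithm: in a toy federated setting, client-specific statistics (here, mean and scale) play the role of the variant representation, while the residual left after removing them is approximately invariant across clients. All names and the split itself are hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two federated clients observe the same underlying signal,
# but each applies its own shift and scale (distribution skew).
signal = rng.normal(0.0, 1.0, size=200)
clients = {
    "client_a": 1.0 * signal[:100] + 0.0,   # reference client
    "client_b": 2.5 * signal[100:] + 4.0,   # shifted and rescaled client
}

def split_representation(x):
    """Split data into a client-variant part (per-client statistics)
    and an approximately invariant part (the standardized residual)."""
    variant = (x.mean(), x.std())
    invariant = (x - variant[0]) / variant[1]
    return invariant, variant

for name, x in clients.items():
    invariant, variant = split_representation(x)
    # After removing client-specific statistics, the invariant parts
    # share roughly the same distribution across clients.
    print(name, round(invariant.mean(), 3), round(invariant.std(), 3))
```

In this caricature, a federated causal method would learn from the invariant part (stable across clients) while still using the variant part to account for each client's local distribution; FedSDWC's actual dual-representation learning is considerably more involved.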
— via World Pulse Now AI Editorial System
