Mixture of Message Passing Experts with Routing Entropy Regularization for Node Classification
Graph neural networks (GNNs) have made significant strides in graph-based learning tasks, but they often struggle with heterophilous structures, where connected nodes differ in features and labels. To tackle this challenge, researchers introduced GNNMoE, a framework that combines a mixture of message-passing experts with routing entropy regularization, enabling adaptive representation learning at the node level. GNNMoE decouples message passing into separate propagation and transformation operations and recombines them through expert networks guided by a hybrid routing mechanism. The routing entropy regularization dynamically balances soft weighting against top-k routing, letting the model adapt flexibly to varying neighborhood contexts. Extensive experiments on twelve benchmark datasets show that GNNMoE consistently outperforms state-of-the-art node classification methods while maintaining scalability and interpretability. This advancement represents …
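For readers who want a concrete picture of the mechanism described above, here is a minimal PyTorch sketch of a mixture of message-passing experts with top-k routing and an entropy term on the routing distribution. This is an illustration under assumptions, not the authors' implementation: the class names, the hop-count experts, and the sign and weighting of the entropy penalty are hypothetical choices made for this example.

```python
# Minimal sketch (hypothetical names, not the GNNMoE reference code):
# each expert applies some propagation steps (P) then a transformation (T);
# a gate routes each node to its top-k experts with soft weights, and an
# entropy term on the routing distribution is returned as a regularizer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MessagePassingExpert(nn.Module):
    """One expert: propagate over a normalized adjacency, then transform."""
    def __init__(self, dim: int, hops: int):
        super().__init__()
        self.hops = hops                      # number of propagation steps (P)
        self.transform = nn.Linear(dim, dim)  # per-node feature transformation (T)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = x
        for _ in range(self.hops):            # P: smooth features over neighbors
            h = adj @ h
        return F.relu(self.transform(h))      # T: transform the aggregated features


class MoERoutingLayer(nn.Module):
    """Routes each node to experts via soft weights restricted to the top-k."""
    def __init__(self, dim: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        # Experts with 0..num_experts-1 hops, so one expert is purely
        # transformation-based (an assumption, to vary neighborhood reach).
        self.experts = nn.ModuleList(
            [MessagePassingExpert(dim, hops=i) for i in range(num_experts)]
        )
        self.gate = nn.Linear(dim, num_experts)  # per-node routing logits
        self.top_k = top_k

    def forward(self, x: torch.Tensor, adj: torch.Tensor):
        probs = F.softmax(self.gate(x), dim=-1)            # [N, E] soft weights
        # Keep only the top-k experts per node, then renormalize the weights.
        _, top_idx = probs.topk(self.top_k, dim=-1)
        mask = torch.zeros_like(probs).scatter_(-1, top_idx, 1.0)
        routed = probs * mask
        routed = routed / routed.sum(dim=-1, keepdim=True)
        # Run all experts and mix their outputs per node.
        out = torch.stack([e(x, adj) for e in self.experts], dim=1)  # [N, E, D]
        y = (routed.unsqueeze(-1) * out).sum(dim=1)                  # [N, D]
        # Mean routing entropy: high entropy means diffuse (soft) routing,
        # low entropy means near-hard top-k routing.
        entropy = -(probs * probs.clamp_min(1e-9).log()).sum(-1).mean()
        return y, entropy
```

In training, the returned entropy would typically be added to the classification loss with a tuned coefficient (e.g. `loss = task_loss + lam * entropy`), so routing can be sharpened toward hard top-k selection or kept soft as the data demands; the sign and size of that coefficient are tuning choices, not something this summary specifies.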
— via World Pulse Now AI Editorial System
