CID: Measuring Feature Importance Through Counterfactual Distributions
Positive | Artificial Intelligence
- A new method for assessing feature importance in Machine Learning, called Counterfactual Importance Distribution (CID), has been introduced. This post-hoc local feature importance method generates positive and negative counterfactuals, models their distributions using Kernel Density Estimation, and ranks features by a distributional dissimilarity measure, offering deeper insight into model decision-making.
- The development of CID is significant because it provides a rigorous mathematical framework for measuring feature importance, addressing limitations of existing methods. By offering a complementary perspective and producing more faithful explanations, CID could advance the interpretability of Machine Learning models.
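The core idea above (compare KDE-smoothed distributions of a feature's values across positive vs. negative counterfactuals) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the counterfactual-generation step is replaced by synthetic samples, and the dissimilarity measure is assumed here to be an L1 distance between the two estimated densities.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Hypothetical stand-ins for one feature's values across counterfactuals:
# pos_cf from counterfactuals that flip the prediction to the positive class,
# neg_cf from counterfactuals that flip it to the negative class.
pos_cf = rng.normal(loc=1.0, scale=0.5, size=200)
neg_cf = rng.normal(loc=-0.5, scale=0.7, size=200)

def distributional_dissimilarity(a, b, grid_size=512):
    """Approximate L1 distance between two KDE-smoothed distributions.

    A large value means the feature takes clearly different values in
    positive vs. negative counterfactuals, i.e. it matters to the decision.
    """
    kde_a, kde_b = gaussian_kde(a), gaussian_kde(b)
    grid = np.linspace(min(a.min(), b.min()), max(a.max(), b.max()), grid_size)
    # Trapezoidal integration of |p_a - p_b| over the shared support
    return np.trapz(np.abs(kde_a(grid) - kde_b(grid)), grid)

score = distributional_dissimilarity(pos_cf, neg_cf)
```

Ranking features then amounts to computing this score per feature and sorting in descending order; the actual CID paper may use a different dissimilarity measure.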
— via World Pulse Now AI Editorial System
