A Practical Introduction to Kernel Discrepancies: MMD, HSIC & KSD

arXiv — cs.LG · Monday, November 3, 2025 at 5:00:00 AM
This article serves as a practical introduction to kernel discrepancies, focusing on Maximum Mean Discrepancy (MMD), the Hilbert-Schmidt Independence Criterion (HSIC), and Kernel Stein Discrepancy (KSD). It discusses the standard estimators, V-statistics and U-statistics, along with more computationally efficient incomplete U-statistics, which subsample the pairwise kernel evaluations to reduce cost. These concepts matter to researchers and practitioners in statistics and machine learning because they provide essential tools for measuring differences between probability distributions.
— via World Pulse Now AI Editorial System
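To make the estimator distinction concrete, here is a minimal sketch of the unbiased U-statistic estimator of squared MMD, using a Gaussian RBF kernel. The function names and the bandwidth choice are illustrative assumptions, not from the article; the V-statistic variant would simply keep the diagonal terms that the U-statistic drops.

```python
import numpy as np

def gaussian_kernel(a, b, bandwidth=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 * bandwidth^2))."""
    sq_dists = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def mmd_u_statistic(x, y, bandwidth=1.0):
    """Unbiased U-statistic estimator of squared MMD between samples x and y.

    Unlike the (biased) V-statistic, the U-statistic excludes the diagonal
    terms k(x_i, x_i) and k(y_j, y_j), so its expectation is exactly MMD^2.
    """
    m, n = len(x), len(y)
    kxx = gaussian_kernel(x, x, bandwidth)
    kyy = gaussian_kernel(y, y, bandwidth)
    kxy = gaussian_kernel(x, y, bandwidth)
    term_xx = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))
    term_yy = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return term_xx + term_yy - 2.0 * kxy.mean()
```

Sampling from the same distribution should yield a value near zero (it can be slightly negative, since the unbiased estimator is not constrained to be non-negative), while separated distributions yield a clearly positive value.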


Continue Reading
Stein Discrepancy for Unsupervised Domain Adaptation
Positive · Artificial Intelligence
A novel framework for unsupervised domain adaptation (UDA) has been proposed, leveraging Stein discrepancy, an asymmetric measure built on the target distribution's score function. The approach aims to improve model performance when target data is limited, a departure from UDA methods that typically rely on symmetric measures such as maximum mean discrepancy (MMD).
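The score-based idea behind Stein discrepancies can be sketched with the kernelized Stein discrepancy in one dimension. This is a minimal illustration under assumed choices (Gaussian RBF base kernel, standard-normal target with score s(x) = -x); it is not the proposed UDA framework itself, only the underlying discrepancy. Note that only the target's score function is needed, not its normalizing constant.

```python
import numpy as np

def ksd_u_statistic(x, score, bandwidth=1.0):
    """Unbiased U-statistic estimator of the squared kernel Stein discrepancy
    for 1-D samples x against a target whose score function d/dx log p is `score`.

    Uses the Langevin Stein kernel for a Gaussian RBF base kernel:
      u_p(x, y) = [s(x)s(y) + (x - y)/h^2 * (s(x) - s(y))
                   + 1/h^2 - (x - y)^2 / h^4] * k(x, y)
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = score(x)
    diff = x[:, None] - x[None, :]                      # x_i - x_j
    k = np.exp(-diff ** 2 / (2.0 * bandwidth ** 2))     # base kernel matrix
    h = (s[:, None] * s[None, :]
         + diff / bandwidth ** 2 * (s[:, None] - s[None, :])
         + 1.0 / bandwidth ** 2
         - diff ** 2 / bandwidth ** 4) * k
    # U-statistic: average over off-diagonal pairs only
    return (h.sum() - np.trace(h)) / (n * (n - 1))
```

Samples drawn from the target give a value near zero, while samples from a shifted distribution give a clearly positive value, which is what makes the discrepancy usable as a training or goodness-of-fit signal.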