Human-centric photo dataset aims to help spot AI biases responsibly
This week, a database of over 10,000 human images was introduced in Nature to help address biases in artificial intelligence models. Developed by Sony AI, the Fair Human-Centric Image Benchmark (FHIBE) is ethically sourced and consent-based, making it a valuable tool for evaluating human-centric computer vision tasks. The initiative is significant because it not only helps identify and correct biases and stereotypes in AI systems but also promotes responsible AI development, helping ensure that the technology serves all individuals fairly.
— via World Pulse Now AI Editorial System

