X updates Grok to prevent the "editing of images of real people in revealing clothing such as bikinis" and geoblocks it for all users "where it's illegal" (Karissa Bell/Engadget)
Negative · Artificial Intelligence

- X has updated its AI tool Grok to block the editing of images of real people into revealing clothing, such as bikinis, and has geoblocked the feature in regions where it is illegal. The change follows significant backlash over the tool's use to create nonconsensual sexualized imagery.
- The update matters for X as the company tries to address ethical concerns and legal scrutiny surrounding Grok, particularly in light of investigations into its role in generating harmful content. By implementing these changes, X aims to limit reputational damage and comply with regulatory pressure.
- More broadly, the update reflects ongoing debates about the ethical use of AI, particularly its connection to gender-based violence and the exploitation of vulnerable individuals. Experts have warned that AI tools can perpetuate harmful stereotypes and facilitate abuse, underscoring the need for robust safeguards in how such systems are built and deployed.
— via World Pulse Now AI Editorial System
