It’s the latest legal challenge over the Grok chatbot’s mass creation of nonconsensual sexual imagery of women and girls.
A tip from an anonymous Discord user led police to what may be the first confirmed Grok-generated child sexual abuse material (CSAM) that Elon Musk’s xAI can’t easily dismiss as nonexistent.
Three teenage plaintiffs in a lawsuit filed Monday accuse xAI of distributing, possessing, and producing with intent to distribute child pornography.