Grok under fire after complaints it undressed minors in photos

Elon Musk’s Grok on Friday said it was scrambling to fix flaws in the artificial intelligence tool after users claimed it turned pictures of children or women into erotic images.

“We’ve identified lapses in safeguards and are urgently fixing them,” Grok said in a post on X, formerly Twitter. “CSAM (Child Sexual Abuse Material) is illegal and prohibited.”

Complaints of abuses began hitting X after an “edit image” button was rolled out on Grok in late December. The button allows users to modify any image on the platform -- with some users deciding to partially or completely remove clothing from women or children in pictures, according to complaints.

Grok maker xAI, run by Musk, replied to an AFP query with a terse, automated response that said: “the mainstream media lies.”

The Grok chatbot, however, did respond to an X user who queried it on the matter, after they said that a company in the United States could