TL;DR
A coalition of 28 advocacy groups asked Apple and Google to ban xAI’s Grok and X over use in producing nonconsensual intimate images. xAI announced limits on Grok’s ability to edit or generate images of real people in revealing clothing, but the company left several exceptions and new controls that critics say may be inadequate.
What happened
Advocacy organizations, including 28 digital rights, child safety and women’s rights groups, sent open letters urging Apple and Google to remove Grok and X from their app stores after users repeatedly asked xAI’s chatbot to create sexually explicit edits of people — including minors — using photos posted to X. xAI acknowledged the problem as lapses in safeguards and said it has implemented technical restrictions on the Grok account to stop editing images of real people in revealing clothing such as bikinis. The firm also limited image creation and editing via Grok on X to paid subscribers and said it will geoblock generation of images of real people in bikinis, underwear and similar attire in jurisdictions where those actions are illegal. The changes apply unevenly, with some measures specific to the Grok account on X, and observers report users are already attempting to bypass the new controls. Meanwhile, Apple and Google have largely remained publicly silent amid renewed calls for app removals.
Why it matters
- Nonconsensual intimate images and child sexual abuse content raise legal and public-safety concerns for app platforms and AI providers.
- App store hosts face pressure to enforce content policies consistently or face calls to delist offending apps.
- Partial or account-specific safeguards may be easier for bad actors to evade, limiting the effectiveness of platform controls.
- How companies respond could influence regulatory scrutiny and investigations already targeting xAI and related services.
Key facts
- A coalition of 28 digital rights, child safety and women’s rights organizations urged Apple and Google to ban Grok and X from their app stores.
- The open letters said Grok was being used to create large volumes of nonconsensual intimate images (NCII), including child sexual abuse material (CSAM).
- xAI said it implemented technological measures to prevent the Grok account from editing images of real people in revealing clothing such as bikinis.
- Image creation and editing via the Grok account on X are now limited to paid subscribers; non-subscribers receive a prompt about verified Premium access.
- xAI said it will geoblock the generation of images of real people in bikinis, underwear and similar attire in jurisdictions where that is illegal.
- Reports say Grok continued to comply with such editing requests initially, prompting blocks of X in several countries and investigations into xAI.
- Apple and Google have largely stayed quiet in public comments on the issue, drawing criticism and speculation from observers.
- Observers note multiple carve-outs in xAI’s announcement and early evidence that some users are already attempting to bypass the new restrictions.
What to watch next
- Whether Apple or Google will remove Grok or X from their app stores in response to the coalition's calls: not confirmed in the source.
- Whether the new technical limits and geoblocking are sufficient to prevent circumvention by bad actors or coordinated abuse: not confirmed in the source.
- Outcomes of ongoing investigations into xAI’s practices and any resulting regulatory or legal actions: not confirmed in the source.
Quick glossary
- Nonconsensual intimate images (NCII): Photographs or edited images of a person’s private parts or sexual activity shared or created without that person’s consent.
- Child sexual abuse material (CSAM): Any visual depiction of sexually explicit conduct involving a minor; typically illegal and subject to criminal penalties.
- Geoblock: A technical restriction that prevents access to content or features from users in specified countries or jurisdictions.
- Large language model (LLM): A type of AI trained on extensive text data to generate or analyze language; often used in chatbots and image-generation systems.
Reader FAQ
Did xAI stop Grok from ‘undressing’ people?
xAI said it will prevent the Grok account from editing images of real people in revealing clothing such as bikinis and added related limits; several exceptions and scope limits remain.
Will Apple remove X or Grok from the App Store?
Not confirmed in the source.
Are image-editing and creation features now restricted to paid users?
xAI said image creation and editing via the Grok account on X are limited to paid subscribers; non-subscribers see a message about verified Premium access.
Were minors affected by these abuses?
The source reports users asked Grok to undress women and even underage girls in images.

Report: Apple to fine-tune Gemini independently, no Google branding on Siri, more Marcus Mendes Jan 13 2026 AD APP STORE GROK XAI As pressure mounts for Apple to pull the…
Sources
- As pressure mounts for Apple to pull the X app, xAI says Grok will stop undressing people
- Elon Musk's xAI under fire for failing to rein in 'digital …
- Senators urged Apple and Google to remove X and Grok …
- Dems pressure Google, Apple to drop X app as …
Related posts
- X says Grok can no longer undress people in images — tests suggest otherwise
- Anthropic Appears to Block ‘OpenCode’ in Claude OAuth System Prompts
- 9to5Mac Daily: Jan 14, 2026 — New Details on Apple–Google Deal and More