TL;DR

A coalition of 28 digital rights groups led by UltraViolet has asked Apple and Google to pull X and its Grok AI from their app stores, accusing the platform of enabling non-consensual intimate images and child sexual abuse material. The calls come as UK regulator Ofcom continues a probe under the Online Safety Act and X tightens controls on Grok’s image-editing features.

What happened

A group of 28 advocacy organizations, coordinated by UltraViolet, sent near-identical letters to Apple CEO Tim Cook and Google CEO Sundar Pichai demanding that X and its Grok AI be removed from the companies' app stores. The campaign, named "Get Grok Gone," alleges that by keeping the apps available, Apple and Google are facilitating and profiting from non-consensual intimate images (NCII) and child sexual abuse material (CSAM) created and circulated using Grok. The letters cite app-store rules prohibiting apps that enable criminal activity or the spread of sexual-exploitation material. The campaign arrives while UK regulator Ofcom pursues a formal investigation under the Online Safety Act into whether misuse of Grok breached legal protections for users. X has already restricted Grok's image editing to paid subscribers, applied geoblocks in some jurisdictions, and said it would stop producing sexualized edits of real people, but advocates say those fixes are inadequate.

Why it matters

  • Raises questions about how app-store platforms enforce policies when apps facilitate harmful AI outputs.
  • Amplifies regulatory scrutiny of AI-driven features and platform responsibilities under laws like the UK's Online Safety Act.
  • Highlights risks to individuals from AI image-editing tools that can generate sexualized or exploitative content.
  • Could set precedents for how major tech companies respond to coordinated civil-society pressure over AI harms.

Key facts

  • A coalition of 28 digital rights organizations, led by UltraViolet, delivered letters to Apple and Google.
  • The campaign is called "Get Grok Gone."
  • Groups accuse Apple and Google of facilitating and profiting from NCII and CSAM by hosting X and Grok.
  • Ofcom is continuing a formal investigation into X under the UK's Online Safety Act.
  • Reporting showed Grok could be manipulated to create sexually explicit edits of real people, with some outputs appearing to involve minors.
  • X initially limited Grok's image-editing features to paid subscribers, then imposed additional geographic restrictions.
  • X stated it would stop producing sexualized edits of real people.
  • Advocates say the platform's changes fall short of preventing further abuse and distribution of illicit content.
  • The Register asked Apple and Google for comment and said it would update the story if responses were received.

What to watch next

  • Whether Apple will remove X and Grok from the App Store — not confirmed in the source.
  • Whether Google will remove X and Grok from the Play Store — not confirmed in the source.
  • Outcome and findings of Ofcom's ongoing investigation under the Online Safety Act — not confirmed in the source.

Quick glossary

  • NCII (non-consensual intimate images): sexually explicit photos or videos created or shared without the subject's consent.
  • CSAM (child sexual abuse material): any visual depiction of sexual activity involving minors; illegal and subject to criminal enforcement.
  • App-store policies: Rules set by platform operators (like Apple and Google) that govern what apps and content are permitted in their storefronts.
  • Geoblocking: Restricting access to online content or features based on a user's geographic location.
  • Chatbot image-editing: AI-driven tools that modify or generate images in response to user prompts.

Reader FAQ

Did Apple or Google remove X and Grok after the letters?
Not confirmed in the source.

What did campaigners request from Apple and Google?
They asked both companies to remove X and Grok from their app stores, arguing the apps enable the spread of NCII and CSAM.

Has X changed Grok's capabilities?
X restricted image-editing to paid subscribers, applied geographic blocks in some countries, and said Grok would stop producing sexualized edits of real people.

Is there regulatory action related to this issue?
Yes. Ofcom has an ongoing formal investigation into X under the UK's Online Safety Act.

