TL;DR
A 9to5Mac opinion piece argues that Apple should stop investing in its own large language models and instead run leading third-party models behind Apple’s Private Cloud Compute, speeding up Siri improvements while preserving user privacy. The author says letting users temporarily switch to third-party assistants and collecting opt-in request data could accelerate development without compromising Apple’s privacy stance.
What happened
Apple has faced criticism for lagging in generative AI and for an Apple Intelligence rollout in which the company acknowledged it had overpromised. An opinion piece by Ben Lovejoy traces his change of view: he once defended Apple’s slow, cautious approach on safety and privacy grounds but now argues the company should abandon building its own LLMs. The shift follows reporting that some Apple executives view large language models as future commodities and that Apple may plan to use Google’s Gemini for many Siri queries. That report says the Gemini-based models would be hosted on Apple’s Private Cloud Compute (PCC) servers, with Apple handling on-device data retrieval for personal requests. Lovejoy also recommends that Apple let users temporarily choose alternative chat assistants and, with permission, gather anonymized request data to guide Siri’s development.
Why it matters
- Shifting away from in‑house model development could let Apple adopt stronger AI capabilities faster without compromising its privacy model.
- Running third‑party models on Apple’s PCC aims to combine leading AI performance with Apple-controlled infrastructure and data handling.
- Allowing users to pick alternative assistants could relieve product pressure and produce useful usage data—if obtained with consent.
- The debate highlights the tension between Apple’s safety and privacy commitments and competitive pressure in the consumer AI market.
Key facts
- The piece is an opinion by Ben Lovejoy published on 9to5Mac.
- Apple faced criticism for its Apple Intelligence launch and admitted it had promised features it could not deliver.
- Lovejoy initially defended Apple’s cautious approach, citing safety incidents with early chatbots and Apple’s privacy stance.
- A recent report cited in the piece indicates some Apple leaders think LLMs will become commodities and question building proprietary models.
- That report suggested Apple might use Google’s Gemini as a backend for many Siri queries.
- The Gemini-based models would run on Apple’s Private Cloud Compute (PCC) servers, according to the reporting cited.
- The article recommends that Apple temporarily permit users to rely on third-party assistants and that it collect opt-in request data to inform Siri’s development.
- The author says he currently has Siri configured to fall back to ChatGPT in cases where Siri cannot help, relying on Apple’s assurances about data use.
What to watch next
- Whether Apple formally abandons in‑house LLM development and announces a new model strategy (not confirmed in the source).
- If Apple will confirm plans to run Google’s Gemini or other third‑party models on its Private Cloud Compute servers (not confirmed in the source).
- Announcements on user controls for selecting third‑party assistants and any opt‑in data collection mechanisms to aid Siri development (not confirmed in the source).
Quick glossary
- Large Language Model (LLM): A machine learning model trained on vast amounts of text to generate human‑like language and answer questions.
- Siri: Apple’s voice assistant, built into iOS and other Apple products to perform tasks, answer queries and control device features.
- Google Gemini: A family of AI language models developed by Google; cited in reporting as a potential backend for some Siri queries.
- Private Cloud Compute (PCC): Apple’s server infrastructure for handling AI requests with privacy protections; cited in reporting as the environment where third-party models could run under Apple’s control.
Reader FAQ
Is Apple confirmed to be abandoning its own LLM development?
Not confirmed in the source; the article cites a report suggesting some Apple leaders favor that path but no official decision is reported.
Will Siri run on Google’s Gemini?
A report mentioned in the piece says Apple may use Gemini for many Siri queries, but that plan is not confirmed in the source.
Does running external models on Apple servers protect privacy?
According to the reporting cited in the article, models hosted on Apple’s PCC would preserve user privacy; the article presents this as the report’s assertion rather than a confirmed fact.
Did Apple admit to overpromising on Apple Intelligence?
Yes. The article notes that Apple acknowledged it had promised Apple Intelligence features it could not deliver.

Sources
- I think Apple should take this radical approach to the new Siri
- Apple Weighs Using Anthropic or OpenAI to Power Siri …
- Abandoning In-House Large Models? Apple Shifts to Gemini + …
- Apple and AI: Yet Another Example of Conway's Law