TL;DR
In a 30-page essay, researcher Sara Hooker argues the past decade's fixation on enlarging models and datasets has reshaped AI research and funding, concentrating power in industry labs and sidelining academia. The paper asserts that the link between compute and model performance is becoming less predictable and that relying solely on scale overlooks other promising paths for progress.
What happened
Sara Hooker published "On the Slow Death of Scaling," a 30-page essay posted to SSRN on December 12, 2025 and last revised on January 6, 2026, that challenges the prevailing strategy of pursuing progress solely by increasing model size and training data. The paper traces how a simple scaling prescription came to dominate investment and research priorities, producing large capital flows into industry labs and altering norms around scientific practice. Hooker contends this focus has marginalized academic participation and coincided with a decline in public dissemination from corporate labs. Central to the essay is the claim that the relationship between training compute and performance is increasingly uncertain and evolving, and that an exclusive emphasis on scale neglects other levers that could drive future advances. The author warns that major disruptions may be imminent if the community continues to treat scaling as the default path forward.
Why it matters
- Concentration of funding and resources around scaling can reshape who leads AI research and what gets studied.
- If the compute–performance relationship is changing, investment strategies tied solely to scale risk becoming inefficient.
- Reduced public disclosure from industry labs could hinder reproducibility and broader scientific progress.
- Ignoring alternative routes to improvement may delay or misdirect future breakthroughs.
Key facts
- Author: Sara Hooker.
- Paper length: 30 pages; originally written December 6, 2025.
- SSRN posting date: December 12, 2025; last revised January 6, 2026.
- Downloads reported on SSRN: 2,729; abstract views: 5,771; SSRN rank: 12,909.
- The essay identifies the dominant prescription for innovation over the last decade: increase model size and training data.
- Hooker argues that this scaling-first mindset shifted capital toward industry labs and reshaped scientific culture.
- The paper asserts academia has been marginalized and that many industry labs have curtailed publishing.
- Core claim: the mapping from training compute to performance is uncertain and changing, making scale a less reliable guide (see the illustrative power law after this list).
- Hooker warns the community is overlooking other levers of progress and that this oversight could produce major disruptions.
- The manuscript cites 155 references and lists keywords including scaling, deep neural networks, efficiency, and scientific progress.
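For context on that core claim, the scaling-laws literature the essay pushes against typically models performance as a smooth power law in training compute. A canonical illustrative form, using standard notation from prior scaling-law work rather than a formula quoted from the essay, is:

```latex
% Canonical compute-loss power law from the scaling-laws literature
% (illustrative; L_\infty, a, and \alpha are empirically fitted
% constants, not values reported in Hooker's essay).
L(C) \;\approx\; L_\infty + a\,C^{-\alpha}
```

In these terms, Hooker's core claim is that the fit itself is becoming unstable, so extrapolating along such a curve is an increasingly unreliable guide to the returns on additional compute.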
What to watch next
- Not confirmed in the source: whether industry labs will resume more open publishing or continue to restrict disclosures.
- Not confirmed in the source: shifts in funding priorities away from pure scaling toward efficiency or alternative methods.
- Not confirmed in the source: emergence of concrete technical alternatives to scaling that materially alter performance trajectories.
Quick glossary
- Scaling: The practice of improving model performance by increasing model size, dataset size, or training compute.
- Training compute: The computational resources (such as GPU/TPU hours) consumed to train machine learning models; see the sketch after this glossary for a common back-of-the-envelope estimate.
- Deep neural networks: A class of machine learning models composed of multiple layers of interconnected units that learn hierarchical representations from data.
- Scientific publishing: The process by which researchers disclose methods and results to the broader community, typically through papers and preprints.
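To make the "training compute" entry concrete, here is a minimal sketch using the common C ≈ 6·N·D heuristic (total training FLOPs ≈ 6 × parameter count × training tokens) from the scaling-laws literature. Both the heuristic and the example model sizes are illustrative assumptions, not figures from the essay.

```python
# Illustrative sketch of the "training compute" glossary entry.
# Uses the common C ~= 6 * N * D heuristic (total FLOPs ~ 6 x parameters
# x training tokens) from the scaling-laws literature; the heuristic and
# the example numbers below are assumptions, not taken from Hooker's essay.

def training_flops(n_params: float, n_tokens: float) -> float:
    """Rough total training compute (FLOPs) for a dense transformer."""
    return 6.0 * n_params * n_tokens

# Hypothetical models, chosen only to show the arithmetic.
for name, n_params, n_tokens in [("small", 1e9, 2e11), ("large", 7e10, 1.4e12)]:
    print(f"{name}: ~{training_flops(n_params, n_tokens):.2e} FLOPs")
```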
Reader FAQ
Who wrote the paper and when was it published?
Sara Hooker; posted on SSRN December 12, 2025 and revised January 6, 2026.
Does the paper claim scaling no longer works at all?
Not outright. The essay argues that the relationship between compute and performance is uncertain and changing, and that relying solely on scaling misses other levers of progress.
Does the paper provide empirical evidence for its claims?
Not confirmed in the source.
Does the author say what the alternative levers of progress are?
Not confirmed in the source.
Sources
- Sara Hooker, "On the Slow Death of Scaling," SSRN working paper, 30 pages. Posted December 12, 2025; last revised January 6, 2026.
Related posts
- Meta’s $2 Billion Manus Deal Draws Uneven Reactions in Washington and Beijing
- McKinsey and General Catalyst: the ‘learn once, work forever’ era is over
- Windows Update Failure Likely Bricked Snapdragon Dev Kit, Owner Says