TL;DR
Multiple independent datasets indicate many Single-Page Applications (SPAs) produce very few client-side view transitions per full page load. If average session depth is indeed this low, the core performance trade that justifies SPAs—paying up-front JavaScript costs to speed follow-up interactions—may be valid for far fewer sites than commonly assumed.
What happened
Alex Russell reports that several web-performance datasets, including the RUM Archive and early analysis of Chrome’s Soft Navigations Origin Trial, converge on a surprising result: sites built as SPAs appear to generate only about one soft navigation for every hard page load. That ratio, if representative, undermines the traditional SPA argument that extra up-front JavaScript is amortized across many in-session interactions. Russell frames the problem with a simple session-amortized latency model: the average cost per interaction is the initial navigation latency plus the sum of subsequent interaction latencies, all divided by the total number of interactions in the session. He sketches how broad adoption of React-style toolchains shipped large amounts of client-side code into the critical path, and how progressive enhancement once limited that exposure. Platform gaps (notably long-standing restrictions from one major vendor) pushed many developers toward heavier client-side architectures. The article calls out two open questions: whether the distribution of session depths invalidates the average, and whether measurement blind spots hide a substantial number of in-page updates.
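The session-amortized model can be sketched in a few lines of Python. The latency figures below are illustrative assumptions, not numbers from the article:

```python
def avg_interaction_latency(hard_nav_ms: float, soft_nav_ms: list[float]) -> float:
    """Session-amortized latency: the initial (hard) navigation cost plus all
    subsequent (soft) interaction costs, averaged over every interaction."""
    return (hard_nav_ms + sum(soft_nav_ms)) / (1 + len(soft_nav_ms))

# Illustrative costs: an SPA pays a heavy first load so that follow-up
# interactions are cheap; an MPA pays a moderate cost on every navigation.
spa_shallow = avg_interaction_latency(5000, [100])      # one soft nav per load
spa_deep = avg_interaction_latency(5000, [100] * 20)    # twenty soft navs
mpa = avg_interaction_latency(1500, [1500])             # every nav is "hard"

print(round(spa_shallow), round(spa_deep), round(mpa))  # 2550 333 1500
```

With only one soft navigation per session, the SPA's average stays dominated by its first load; only deep sessions amortize the up-front cost below a steady per-navigation price.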
Why it matters
- Development teams routinely choose SPA toolchains; if sessions are shallow, their up-front costs may not be justified for many sites.
- Large client-side bundles increase initial load latency for every user, and any team member's change can introduce a regression.
- Performance variance and long tails (P75+) impact accessibility, reliability, and equity; mistaken architecture choices worsen these outcomes.
- If common analytics miss many in-page updates, current metrics could be misleading, affecting product decisions and optimization priorities.
Key facts
- Multiple independent sources—the RUM Archive and Chrome Soft Navigations Origin Trial analysis—report low soft-nav to hard-load ratios for SPAs.
- The observed average suggests roughly one soft navigation per full page load on many SPA-built sites.
- The SPA trade-off: more up-front JavaScript to make follow-up interactions feel faster; session depth (N) determines the amortized benefit.
- Russell presents a session-weighted latency formula to reason about the costs and benefits of client-side routing and interactions.
- Widespread adoption of React-oriented toolchains has led to large critical-path bundles being sent to clients by default.
- Progressive enhancement previously kept server-side UI logic primary and layered JavaScript incrementally; that approach is less common now.
- Persistent platform gaps—cited as long-term constraints from a major browser vendor—have driven some application classes toward heavy client-side code.
- Specialized remediation roles and teams have emerged to deal with performance regressions introduced by SPA architectures.
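To make the role of session depth (N) concrete, the amortized model can be solved for the break-even point where an SPA's session average matches a plain multi-page site. All costs here are hypothetical, chosen only to show the shape of the trade:

```python
def break_even_soft_navs(spa_first_load_ms: float,
                         mpa_nav_ms: float,
                         soft_nav_ms: float) -> float:
    """Solve (spa_first_load + N * soft_nav) / (N + 1) == mpa_nav for N:
    the number of soft navigations at which the SPA's session-averaged
    latency equals an MPA that pays mpa_nav_ms on every navigation."""
    return (spa_first_load_ms - mpa_nav_ms) / (mpa_nav_ms - soft_nav_ms)

# Hypothetical costs: 5 s SPA first load, 1.5 s per MPA navigation,
# 100 ms per SPA soft navigation.
print(break_even_soft_navs(5000, 1500, 100))  # 2.5
```

Under these made-up numbers the SPA only wins on average once sessions reach roughly three soft navigations, well above the ~1 ratio the datasets report.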
What to watch next
- Further analysis of session-depth distributions to determine whether a low average hides a long tail of deep sessions.
- Efforts to quantify how many in-page, server-synchronized view updates are not captured by soft-navigation metrics.
- The practical impacts of features like Service Workers, Multi-Page View Transitions, and Speculation Rules on the SPA trade-off.
Quick glossary
- Single-Page Application (SPA): A web app architecture that loads a single HTML page and dynamically updates the view using client-side JavaScript without full page reloads.
- Soft navigation: A client-side view transition (route change) handled without a full, hard page reload; often tracked to measure SPA interactions.
- Hard navigation: A full page load initiated by the browser, typically involving fetching a new document from the server.
- Progressive enhancement: A development approach that builds core functionality on server-rendered HTML and layers JavaScript enhancements on top.
- RUM (Real User Monitoring): A method of collecting performance data from actual user sessions in production to understand real-world behavior.
Reader FAQ
Do the datasets prove SPAs are a bad choice for all sites?
Not confirmed in the source; the article argues that if average session depth is low, many SPA trade-offs may not pay off, but it also notes distributional uncertainties.
Which datasets are reporting the shallow-session signal?
The RUM Archive and early analysis of Chrome’s Soft Navigations Origin Trial are cited as agreeing on the low soft-nav counts.
Could analytics be missing interactions that would change the conclusion?
The article raises that possibility: some in-page updates may not be tracked as soft navigations and could represent ‘dark matter’ in measurements.
Should teams stop using SPA frameworks immediately?
Not confirmed in the source; the piece suggests the decision should be re-evaluated in light of session-depth data and the specific application’s interaction patterns.

Source article: “The Curious Case of the Shallow Session SPAs,” 31st Dec 2025, by Alex Russell (@slightlylate), Partner Program Architect on the Microsoft Edge team.
Sources
- The Curious Case of the Shallow Session SPAs
- The Death of Single Page Applications (SPA)
- If Not React, Then What?