TL;DR
Eric Zhang (ekzhang) released jax-js, a reimplementation of JAX in pure JavaScript that compiles numerical programs to WebGPU and WebAssembly kernels so they run natively in the browser. The library exposes a JAX-like API, supports grad/vmap/jit, and is available as an open-source npm package.
What happened
Eric Zhang published jax-js, a machine-learning and array library implemented entirely in JavaScript that targets browser-native runtimes. The project traces numerical programs and emits WebGPU and WebAssembly kernels, so compute runs outside the JavaScript interpreter; the author says this enables near-native performance for many workloads. jax-js is distributed as a zero-dependency npm package (@jax-js/jax) and offers an API modeled on Google's JAX, with notable JavaScript-specific differences: no operator overloading, explicit .js() conversion back to plain arrays, and move-like reference semantics via .ref/.incRef. The library includes the core JAX transformations grad, vmap, and jit, plus examples of in-browser training (including an MNIST demo) and demos such as in-browser CLIP embeddings. The code and resources are published on GitHub (ekzhang/jax-js) along with a website, REPL, and API reference.
Why it matters
- Moves more numerical and ML workloads into the browser by compiling to WebGPU/Wasm rather than running in the JS interpreter.
- Provides a JAX-like developer experience in JavaScript, lowering friction for web-first ML experimentation.
- Enables interactive workflows (hot reloading, live training) that are harder to achieve when compute runs on remote servers.
- If adopted, it could expand options for client-side inference and prototyping without server-side runtimes.
Key facts
- jax-js is a pure JavaScript reimplementation of ideas from JAX and is open-sourced at ekzhang/jax-js.
- The package has no runtime dependencies and is published as @jax-js/jax on npm.
- The runtime generates WebGPU and WebAssembly kernels and dispatches them to native browser runtimes.
- The API mirrors JAX concepts (grad, vmap, jit) but adapts them to JavaScript: method calls such as ar.mul(10) stand in for operator overloading, and .js() extracts plain JavaScript arrays.
- Arrays use move-like semantics with explicit reference management (.ref and .incRef) because JS lacks native destructors.
- Author reports an MNIST browser training demo that achieves >99% accuracy in seconds and a CLIP embedding demo running in-browser.
- Performance claims in the post include ~500 GFLOP/s on an M1 Pro for a text-embedding demo and matmul kernels exceeding 3 TFLOP/s on a MacBook M4 Pro (as reported by the author).
- Some kernels are noted as unoptimized (for example, conv2d and transformer inference need further work).
- Resources provided: project website, REPL, API reference, and GitHub repository.
What to watch next
- Optimizations for convolution and transformer inference performance (author notes these are currently suboptimal).
- Ecosystem integrations such as model import/workflow compatibility with runtimes like ONNX or other browser runtimes.
- Wider adoption, plus real-world performance and security characteristics in production applications (not assessed in the source).
Quick glossary
- WebGPU: A web standard that exposes modern GPU capabilities to browser applications for compute and graphics workloads.
- WebAssembly (Wasm): A binary instruction format for a stack-based virtual machine that enables near-native performance for code in web environments.
- JIT (Just-In-Time) compilation: A technique that compiles code at runtime to optimized machine code to improve execution speed.
- Autodiff / grad: Automatic differentiation mechanisms that compute derivatives of functions programmatically to support gradient-based optimization.
Reader FAQ
How do I install jax-js?
Install via npm: npm install @jax-js/jax.
Can jax-js run on the GPU in the browser?
Yes. Call init('webgpu') and setDevice('webgpu') to target WebGPU-enabled browsers.
Is jax-js a drop-in replacement for JAX (Python)?
Not confirmed in the source; it implements many similar primitives and a JAX-like API but has JavaScript-specific differences and some unoptimized kernels.
Where is the source code hosted?
The project is open-source on GitHub at ekzhang/jax-js.
Sources
- Show HN: Jax-JS, array library in JavaScript targeting WebGPU
- Show HN: jax-js, an ML library and compiler for the web
- ekzhang/jax-js: JAX in JavaScript – ML library for the web, …
- JAX-JS: The Framework That Just Broke ML Wide Open