TL;DR
The author revisits Tony Hoare’s “billion-dollar mistake” label for null pointers and argues that null dereferences are among the easiest invalid-memory errors to catch, while not being the dominant class of memory faults in systems languages. Language design choices, such as zero-initialization and optional types, carry trade-offs that can affect performance, ergonomics, and systems-level programming patterns.
What happened
A recent essay revisits the long-running critique of null references, which Tony Hoare famously described as the “billion-dollar mistake.” The writer argues that the common framing overstates the problem: null is only one kind of invalid address. In garbage-collected languages it is often the most common one, but in systems-level languages such as C or Odin many other invalid-address bugs (use-after-free, bad pointer arithmetic, reads of unmapped pages) are at least as important. The piece explains that dereferencing null is a symptom that surfaces as a runtime panic, not the root cause, and that seemingly easy fixes introduce their own trade-offs. Odin’s designer chose to keep nil pointers and rely on implicit zero-initialization and other language features rather than force explicit per-element initialization or make non-nil pointers the default. The article also notes that Odin provides Maybe(^T), akin to Rust’s Option, and that distinct array types reduce some C-style pointer errors.
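To make the zero-initialization point concrete, here is a minimal Odin sketch (constructed for this summary, not taken from the essay; the Node type is invented). Every declared variable gets its type’s zero value, so pointer fields start out as nil and the idiomatic discipline is an explicit nil check before use:

```odin
package main

import "core:fmt"

Node :: struct {
	value: int,
	next:  ^Node, // pointer fields zero-initialize to nil automatically
}

main :: proc() {
	n: Node // implicitly zero-initialized: n.value == 0, n.next == nil
	fmt.println(n.value, n.next == nil) // prints: 0 true

	// Dereferencing a nil pointer faults at runtime (a panic, in the
	// article's terms), so code checks before following the pointer:
	if n.next != nil {
		fmt.println(n.next.value)
	}
}
```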
Why it matters
- Language defaults (nil allowed vs non-nil) shape programming patterns and architectural choices across codebases.
- Fixes that eliminate null by design can impose costs in initialization time or programmer ergonomics, especially in systems programming.
- Null dereferences are usually easy to detect at runtime, so eliminating them alone does not address deeper memory-safety classes such as use-after-free.
- Choices about pointer semantics affect interoperability with low-level features such as custom allocators and manual memory management.
Key facts
- Tony Hoare coined the phrase “billion-dollar mistake” to describe inventing the null reference.
- The author contends that the billion-dollar figure is likely hyperbolic and not an empirical estimate.
- Null is conventionally represented as zero and used as a sentinel for unset pointers; modern platforms typically leave the lowest pages of virtual memory unmapped so that null dereferences fault immediately and are caught.
- In garbage-collected languages null is commonly the easiest invalid address to reach; in systems languages other invalid addresses are also common.
- Dereferencing null typically produces a runtime panic; the author treats that as a symptom rather than the fundamental problem.
- Odin retains nil pointers by design and zero-initializes variables by default instead of forcing explicit individual-element initialization.
- Odin includes Maybe(^T), semantically similar to Rust’s Option<&T>, but the author says it’s infrequently required in typical Odin code.
- The author argues C’s conflation of pointers and arrays contributes to many NULL-related bugs, and that Odin addresses this with distinct, bounds-checked array and slice types (see the sketch after this list).
- Removing nil by default would force either ubiquitous runtime checks or a programming model requiring explicit initialization everywhere, both of which have trade-offs.
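As a sketch of the array-type point above (illustrative, with invented identifiers, not code from the source): Odin separates fixed-length arrays, bounds-checked slices, and C-style multi-pointers, so the pointer/array conflation that C permits becomes a type distinction rather than a latent bug:

```odin
package main

import "core:fmt"

main :: proc() {
	arr: [4]int              // fixed-length array; the length is part of the type
	s: []int = arr[:]        // slice: pointer + length, indexed with bounds checks
	mp: [^]int = raw_data(s) // multi-pointer: C-like and unchecked, meant for foreign interop

	fmt.println(len(arr), len(s)) // prints: 4 4
	fmt.println(s[3])             // in bounds; s[4] would trip the bounds check and panic
	fmt.println(mp[0])            // no length information; confined to interop boundaries
}
```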
What to watch next
- Whether other systems-level languages change their default pointer initialization or introduce new nullable/non-nullable pointer models (not confirmed in the source)
- Adoption trends for Odin’s approach (implicit zero-initialization, separate array types, Maybe) and whether they influence other language designs (not confirmed in the source)
Quick glossary
- null pointer (nil, NULL, nullptr): A sentinel pointer value that does not reference a valid object; commonly represented as zero in many systems.
- runtime panic: An unrecoverable error that occurs during program execution, often triggered by illegal memory access such as dereferencing a null pointer.
- maybe/option type: A type that explicitly represents a value that may be present or absent, used to avoid implicit nulls and make absence explicit in code.
- zero-initialization: A language or runtime convention where newly allocated memory is filled with zeros by default, often making a type’s ‘zero value’ predictable.
- use-after-free: A class of memory error that occurs when code accesses memory after it has been released, leading to undefined behavior.
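To contrast with null dereferences, here is a minimal use-after-free sketch in Odin (illustrative only): after free, the pointer still holds its old address, so no nil check can flag the bug, which is exactly why the article treats this class as harder to catch than null:

```odin
package main

import "core:fmt"

main :: proc() {
	p := new(int) // heap allocation via the implicit context allocator
	p^ = 42
	free(p)       // memory released; p now dangles

	// p is NOT nil after free, so the easy runtime signal is gone:
	fmt.println(p != nil) // prints: true
	// p^ = 7 here would be a use-after-free: undefined behavior, often silent
}
```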
Reader FAQ
Who called null references the “billion-dollar mistake”?
Tony Hoare coined that phrase to describe introducing the null reference; the article cites his statement.
Does the piece claim null pointers are the biggest memory-safety problem?
No. It argues null dereferences are easy to find but not necessarily the most common or serious invalid-memory class in systems languages.
Did Odin remove nil pointers?
No. Odin retains nil pointers and uses implicit zero-initialization; it also offers Maybe(^T) but the language designer prefers keeping nil for systems programming flexibility.
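As a short sketch of how Maybe(^T) reads in practice (the first_even helper and its data are invented for illustration, not from the source): absence is explicit in the return type, and the caller unwraps with `.?` instead of trusting a raw pointer. Because a pointer already has a nil value available, Maybe(^T) can typically represent the absent case with the nil pointer itself, adding no space overhead:

```odin
package main

import "core:fmt"

// Hypothetical helper: a pointer to the first even element, or nothing.
first_even :: proc(xs: []int) -> Maybe(^int) {
	for &x in xs {
		if x % 2 == 0 {
			return &x
		}
	}
	return nil // nil is the union's "absent" state
}

main :: proc() {
	data := []int{1, 3, 4, 7}
	if p, ok := first_even(data).?; ok {
		fmt.println("found:", p^) // prints: found: 4
	} else {
		fmt.println("no even element")
	}
}
```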
Is the ‘billion-dollar’ estimate supported empirically in the article?
The author treats that figure as hyperbole and says a concrete empirical estimate is not provided.
Sources
- Was It Really a Billion Dollar Mistake?
- Null References: The Billion Dollar Mistake
- Null Reference — A billion dollar mistake