TL;DR
Hardware maker Odinn showed Omnia at CES: a 35 kg enclosure that packs AMD EPYC CPUs, up to four Nvidia H200 NVL GPUs, and 6 TB of RAM into a single unit with carry handles. The firm pitches the box as a portable datacenter that can work alone or be clustered into larger GPU racks; pricing and commercial details were not disclosed.
What happened
At CES, Odinn revealed the Omnia system, a single-enclosure server described as roughly carry-on size but weighing about 35 kg (77 pounds). The chassis houses AMD EPYC 9965 processors, provision for up to four Nvidia H200 NVL GPUs, and as much as 6 TB of memory. The unit includes a 23.8-inch display, a flip-down keyboard, built-in cooling, redundant Platinum-rated power supplies and carry handles intended to make it transportable. Odinn positions Omnia as a "portable datacenter" for tasks such as edge inferencing, film production, military AI missions and enterprise simulations, and offers four configuration profiles: AI, Creator, Search and a high-end Omnia X. The company also says multiple Omnia units can be clustered into so-called Infinity Racks to create larger on-site GPU clusters. Odinn has not published pricing on its website.
Why it matters
- Provides dense, server-grade GPU compute that can be moved to where workloads run, potentially reducing data transfer latency for on-site inference and production.
- Offers a self-contained option for industries needing powerful, temporary or mobile compute—film crews, remote enterprise sites, or military deployments, per the vendor.
- Packaging high-value GPUs in a portable form raises practical logistics and security questions for transport, custody and theft prevention.
- If clusterable as advertised, units could be combined to scale GPU capacity at remote locations without relying on permanent datacenter facilities.
Key facts
- System name: Omnia (by Odinn).
- Weight: about 35 kg (77 pounds); described as roughly carry-on size.
- Processors: AMD EPYC 9965 CPUs.
- GPUs: up to four Nvidia H200 NVL cards.
- Memory: up to 6 TB of RAM.
- Built-in 23.8-inch display and flip-down keyboard.
- Power and cooling: redundant Platinum-rated PSUs and integrated cooling.
- Four vendor configurations listed: AI, Creator, Search and Omnia X.
- Multiple units can be clustered into "Infinity Racks," according to the company.
- No pricing listed on Odinn's website; a single H200 GPU was estimated in the report to cost about $32,000.
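Combining the report's per-GPU estimate with the four-card ceiling gives a rough lower bound on the GPU bill of materials alone. A minimal back-of-the-envelope sketch, assuming the ~$32,000 H200 figure from the report (actual system and card pricing is not confirmed):

```python
# Rough GPU cost floor for one fully loaded Omnia unit.
# Both inputs are assumptions: the per-card price is the report's
# estimate, and vendor pricing for the system is undisclosed.
H200_EST_PRICE_USD = 32_000   # estimated price of one H200 NVL card
MAX_GPUS_PER_UNIT = 4         # maximum H200 NVL cards per Omnia unit

gpu_floor_usd = H200_EST_PRICE_USD * MAX_GPUS_PER_UNIT
print(f"GPU cost floor per unit: ${gpu_floor_usd:,}")
# GPU cost floor per unit: $128,000
```

That ~$128,000 covers only the GPUs; CPUs, memory, chassis and integration would sit on top of it.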
What to watch next
- Pricing, availability and shipment details: not confirmed in the source.
- How customers address transport, insurance and theft risks for high-value portable units: not confirmed in the source.
- Real-world performance and scaling when multiple Omnia units are clustered into Infinity Racks: not confirmed in the source.
Quick glossary
- GPU: Graphics Processing Unit, a processor optimized for parallel computation commonly used for machine learning, graphics and scientific workloads.
- CPU (AMD EPYC): Central Processing Unit; AMD EPYC is a family of server processors designed for data center and high-performance computing tasks.
- Edge inferencing: Running AI model inference close to where data is generated, reducing latency and bandwidth use compared with sending data to centralized datacenters.
- Datacenter: A facility or collection of hardware that provides computing, storage and networking resources to run applications and store data.
Reader FAQ
How many GPUs does the Omnia support?
The system supports up to four Nvidia H200 NVL GPUs.
How much does an Omnia unit cost?
Not confirmed in the source.
Is Odinn pitching this as a carry-on airline item?
The vendor describes the chassis as carry-on sized, but the report notes the 77-pound weight and doubts about airline acceptance; official airline policies are not confirmed in the source.
Can multiple units be combined for greater capacity?
Odinn says Omnia units can be clustered into "Infinity Racks" to build larger GPU clusters.
Who are the intended users?
Odinn lists potential uses including edge inferencing, film and postproduction, military AI missions and enterprise simulations.

Source article: "Luggable datacenter: startup straps handles to server with 4 H200 GPUs", Dan Robinson, Wed 7 Jan 2026, 16:21 UTC.
Sources
- Luggable datacenter: startup straps handles to server with 4 H200 GPUs
- US startup unveils AI supercomputer OMNIA the size of a …
- ODINN Unveils Carry-On-Sized AI Supercomputer at CES …
- NVIDIA and Partners Build America's AI Infrastructure and …