4 Sources
[1]
The world's smallest AI supercomputer is the size of a power bank
Tiiny AI has unveiled what Guinness World Records has verified as the world's smallest personal AI supercomputer. It is called the Tiiny AI Pocket Lab, and despite being about the size of a power bank, it promises performance levels that normally require very expensive hardware. Other small supercomputers, such as NVIDIA's Project Digits, priced around $3,000, and the DGX Spark, priced at $4,000, sit at price points that put them out of reach for most everyday users.

Tiiny AI argues that today's real AI bottleneck is not computing power but our reliance on the cloud. GTM director Samar Bhoj says, "intelligence shouldn't belong to data centers, but to people." By running large models locally, the Pocket Lab aims to reduce cloud dependency, improve privacy, and make advanced AI feel personal rather than remote.

The tech that powers Tiiny AI's supercomputer

The Pocket Lab measures just 14.2 × 8 × 2.53 cm and weighs only 300 grams, yet the company says it can deploy large language models with up to 120 billion parameters. Models of this size are normally associated with server racks or professional GPUs, but Tiiny AI wants to bring that capability to a device that fits in your hand. The Pocket Lab is built on the newest ARM v9.2 12-core CPU and supports popular open-source models such as GPT-OSS, Llama, Qwen, DeepSeek, Mistral, and Phi. At the heart of the device is a discrete neural processing unit capable of delivering 190 TOPS. It also includes 80 gigabytes of LPDDR5X memory, which allows for aggressive quantization so massive models can run locally without depending on cloud infrastructure.

Tiiny AI has also built two key technologies into the system. TurboSparse is a neuron-level sparse activation method that improves inference efficiency without reducing model intelligence. PowerInfer is a heterogeneous inference engine that splits AI workloads across the CPU and NPU, giving server-grade performance while keeping power demands low. This combination makes the Pocket Lab a compelling option for anyone experimenting with local AI, whether for research, robotics, or advanced reasoning tasks.

Tiiny AI plans to showcase the device at CES 2026. Pricing and release details are still under wraps, but the industry will be watching closely to see how a supercomputer this small performs once it reaches real users.
[2]
Meet the world's smallest AI supercomputer that fits in your pocket
Tiiny AI unveiled the Tiiny AI Pocket Lab, verified by Guinness World Records as the world's smallest personal AI supercomputer. This device, comparable in size to a power bank, enables local execution of large AI models to address cloud dependency issues, enhancing privacy and accessibility for individual users. The Pocket Lab measures 14.2 by 8 by 2.53 centimeters and weighs 300 grams, allowing it to fit easily into a pocket or bag.

Tiiny AI positions this supercomputer as an option for everyday users, contrasting with other compact devices like NVIDIA's Project Digits, which costs approximately $3,000, and the DGX Spark, priced at $4,000. These higher price points limit their availability to broader audiences, while the Pocket Lab targets more widespread adoption.

Tiiny AI identifies reliance on cloud infrastructure as the primary limitation in current AI systems, rather than insufficient computing power. The company emphasizes local processing to make AI more personal. GTM director Samar Bhoj states, "intelligence shouldn't belong to data centers, but to people." This approach reduces dependence on remote servers, strengthens data privacy by keeping computations on-device, and delivers AI capabilities directly to users without network requirements.

The device supports deployment of large language models reaching up to 120 billion parameters, a scale typically requiring extensive server racks or specialized professional graphics processing units. Such models demand significant resources, yet the Pocket Lab handles them through optimized hardware and software integrations, bringing high-level AI performance to a portable form factor.

At its foundation, the Pocket Lab uses the ARM v9.2 architecture with a 12-core central processing unit. It accommodates various open-source models, including GPT-OSS, Llama, Qwen, DeepSeek, Mistral, and Phi. These models serve diverse applications, from natural language processing to specialized tasks, and the device's compatibility expands possibilities for developers and researchers working offline.

Central to its operation is a discrete neural processing unit that achieves 190 tera operations per second. This unit pairs with 80 gigabytes of LPDDR5X memory, facilitating aggressive quantization techniques. Quantization compresses model data to lower precision levels, enabling efficient local inference on constrained hardware while preserving essential computational accuracy.

Tiiny AI incorporates two proprietary technologies to enhance performance. TurboSparse employs neuron-level sparse activation, selectively deactivating less critical neural pathways during inference. This method increases efficiency by reducing unnecessary computations, maintaining full model intelligence without quality degradation. PowerInfer functions as a heterogeneous inference engine, distributing AI workloads dynamically between the central processing unit and neural processing unit. This division optimizes resource use, delivering performance comparable to server environments at reduced power consumption levels suitable for portable devices.

The combination of these elements supports applications in research, robotics, and advanced reasoning tasks. Tiiny AI intends to demonstrate the Pocket Lab at CES 2026. Details on pricing and availability remain undisclosed at this stage.
[3]
This tiny personal AI supercomputer can run 120B AI models while fitting in your hand
TL;DR: Tiiny AI's Pocket Lab, the world's smallest personal AI supercomputer verified by Guinness World Records, runs 120-billion-parameter LLMs fully on-device without cloud or internet. It offers energy-efficient, private AI computing with 80GB RAM, 1TB SSD, and advanced encryption, enabling secure, offline AI for developers and professionals.

US deep-tech AI startup Tiiny AI has just unveiled the world's smallest personal AI supercomputer, the Tiiny AI Pocket Lab, officially verified by Guinness World Records under "The Smallest MiniPC (100B LLM Locally)". This is the first global unveiling of the Tiiny AI Pocket Lab, which will fit in your hands -- or your pocket, duh -- and is capable of running up to a full 120-billion-parameter LLM (Large Language Model) entirely on-device, without the need for cloud connectivity, servers, or high-end GPUs.

Tiiny has developed its super-small AI supercomputer for energy-efficient personal intelligence, and the Tiiny AI Pocket Lab runs within a 65W power envelope, enabling massive AI model performance at a fraction of the energy and carbon footprint of traditional GPU-based systems.

The Tiiny AI Pocket Lab is designed to support nearly all major personal AI use cases, serving developers, researchers, creators, professionals, and students. It enables multi-step reasoning, deep context understanding, agent workflows, content generation, and secure processing of sensitive information, even without internet access. The tiny AI supercomputer also provides true long-term personal memory by storing user data, preferences, and documents locally with bank-level encryption, offering a level of privacy and persistence that cloud-based AI systems cannot provide.

Samar Bhoj, GTM Director of Tiiny AI, said: "Cloud AI has brought remarkable progress, but it also created dependency, vulnerability, and sustainability challenges. With Tiiny AI Pocket Lab, we believe intelligence shouldn't belong to data centers, but to people. This is the first step toward making advanced AI truly accessible, private, and personal, by bringing the power of large models from the cloud to every individual device."

Tiiny AI supercomputer key specifications:
[4]
Meet the World's Smallest 'Supercomputer' from Tiiny AI; A Machine Bold Enough to Run 120B AI Models Right in the Palm of Your Hand
Compact AI devices have become increasingly mainstream, but a new startup has broken the barriers by introducing the world's smallest AI supercomputer, which appears highly capable on paper. Edge AI has become an emerging segment of the computing industry, primarily because deploying open-source models on local machines allows for a more personalized workload. However, it also requires expensive hardware. Devices like NVIDIA's DGX Spark can cost up to $4,000, which isn't feasible for a general consumer.

A startup called Tiiny AI plans to bridge this gap, not only by introducing a cost-effective solution, but also by introducing a device claimed to be the 'world's smallest' supercomputer, called the Tiiny AI Pocket Lab. Interestingly, the device measures just 14.2 × 8 × 2.53 cm and weighs 300 g, yet Tiiny AI claims that the supercomputer can successfully deploy a 120-billion-parameter model, a one-of-a-kind achievement. LLMs usable with this machine are said to be perfect for "PhD-level reasoning, multi-step analysis, and deep contextual understanding." With on-device capabilities, the AI Pocket Lab is ideal not only for consumers but also for those seeking to experiment with local LLM deployment.

Based on what Tiiny AI has disclosed, the AI Pocket Lab supports models from GPT-OSS, Llama, Qwen, DeepSeek, Mistral, and Phi. One of the most impressive aspects of the AI Pocket Lab is that it can deliver 190 TOPS with a discrete NPU onboard. With 80 GB of LPDDR5X RAM onboard, aggressive quantization allows a 120B model to run seamlessly in a local environment. Moreover, Tiiny AI says it has employed two techniques that make 120B inference practical:

TurboSparse, a neuron-level sparse activation technique, significantly improves inference efficiency while maintaining full model intelligence.

PowerInfer, an open-source heterogeneous inference engine with more than 8,000 GitHub stars, accelerates heavy LLM workloads by dynamically distributing computation across the CPU and NPU, enabling server-grade performance at a fraction of traditional power consumption.

Together, these technologies allow the Tiiny AI Pocket Lab to deliver capabilities that previously required professional GPUs costing thousands of dollars. The device is set to be showcased at CES 2026. Although the firm hasn't disclosed details about the release date and retail availability, the AI Pocket Lab certainly appears to be a promising device. It will be interesting to see how its industry debut turns out.
US startup Tiiny AI has introduced the Pocket Lab, verified by Guinness World Records as the world's smallest personal AI supercomputer. Despite measuring just 14.2 × 8 × 2.53 cm and weighing 300 grams, the device can run large language models with up to 120 billion parameters entirely on-device. The company aims to reduce cloud dependency and enhance privacy by bringing server-grade AI capabilities to a portable device that fits in your hand.
US deep-tech startup Tiiny AI has unveiled the Pocket Lab, officially verified by Guinness World Records as the world's smallest personal AI supercomputer.[1] The device measures just 14.2 × 8 × 2.53 cm and weighs only 300 grams, making it comparable in size to a power bank, yet it promises to run large language models with up to 120 billion parameters entirely on-device.[2]
This marks a shift in how advanced AI computing could reach individual users, moving capabilities that typically require expensive server racks or professional GPUs into something that fits in your pocket.
Tiiny AI argues that the real bottleneck in AI today isn't computing power but our reliance on cloud infrastructure. GTM director Samar Bhoj states, "intelligence shouldn't belong to data centers, but to people."[1] By enabling local execution of AI models, the Pocket Lab aims to reduce cloud dependency while enhancing privacy, keeping computations and user data on-device rather than transmitting them to remote servers.[2] This approach addresses growing concerns about data vulnerability and the sustainability challenges associated with centralized cloud AI systems.[3]
The AI supercomputer is built on the ARM v9.2 architecture with a 12-core CPU and supports popular open-source models including GPT-OSS, Llama, Qwen, DeepSeek, Mistral, and Phi.[1] At its core sits a discrete neural processing unit (NPU) capable of delivering 190 TOPS, paired with 80 gigabytes of LPDDR5X memory.[2] This substantial memory allocation facilitates aggressive quantization, which compresses model weights to lower precision so that a 120-billion-parameter model can run locally without sacrificing essential computational accuracy.[4]
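The sources only give the 120-billion-parameter figure and the 80 GB memory capacity, not the precision Tiiny AI actually quantizes to, so the back-of-the-envelope sketch below is an illustrative assumption showing why aggressive quantization is what makes a model of this size fit on the device at all.

```python
# Rough memory estimate for a 120B-parameter model at different precisions.
# Illustrative only: real deployments also need room for the KV cache,
# activations, and the runtime, so usable headroom is smaller than shown.

PARAMS = 120e9      # 120 billion parameters (figure quoted by Tiiny AI)
MEMORY_GB = 80      # LPDDR5X capacity quoted for the Pocket Lab

def weights_gb(bits_per_param: float) -> float:
    """Approximate weight storage in gigabytes at a given precision."""
    return PARAMS * bits_per_param / 8 / 1e9

for label, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    size = weights_gb(bits)
    verdict = "fits" if size < MEMORY_GB else "does not fit"
    print(f"{label}: ~{size:.0f} GB of weights -> {verdict} in {MEMORY_GB} GB")

# FP16: ~240 GB and INT8: ~120 GB both exceed the device's memory;
# INT4: ~60 GB fits, leaving roughly 20 GB for cache and runtime state.
```

Only at around 4 bits per weight does the arithmetic work out, which is consistent with the coverage's emphasis on "aggressive" quantization rather than ordinary half-precision inference.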

The device operates within a 65W power envelope, delivering performance at a fraction of the energy and carbon footprint of traditional GPU-based systems.[3] For context, competing compact devices like NVIDIA's Project Digits cost approximately $3,000, while the DGX Spark comes in at $4,000, putting them out of reach for most everyday users.[1]
Tiiny AI positions the Pocket Lab as a more accessible alternative that democratizes access to edge AI capabilities.

Tiiny AI has integrated two proprietary technologies that make running massive models practical on such compact hardware. TurboSparse employs neuron-level sparse activation, selectively deactivating less critical neural pathways during inference to increase efficiency without reducing model intelligence. PowerInfer, an open-source heterogeneous inference engine with over 8,000 GitHub stars, dynamically distributes AI workloads across the CPU and NPU.[4] This split-processing approach delivers server-grade performance while keeping power consumption low enough for portable operation. The combination of these technologies enables capabilities previously reserved for professional GPUs costing thousands of dollars.[4]
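The coverage does not describe the internals of TurboSparse or PowerInfer, so the following is only a minimal conceptual sketch of the general idea behind neuron-level sparse activation: skip feed-forward neurons whose outputs are expected to be zero, so only a fraction of the weights are touched per token. The toy predictor, keep ratio, and layer sizes are placeholders, not the actual techniques.

```python
# Conceptual sketch of neuron-level sparse activation in one FFN layer.
# Not TurboSparse or PowerInfer: the "predictor" and threshold are toy stand-ins.
import numpy as np

rng = np.random.default_rng(0)
d_model, d_ffn = 512, 2048

W_in = rng.standard_normal((d_model, d_ffn)) / np.sqrt(d_model)
W_out = rng.standard_normal((d_ffn, d_model)) / np.sqrt(d_ffn)

def dense_ffn(x):
    # Baseline: every hidden neuron is computed, touching all weights.
    return np.maximum(x @ W_in, 0.0) @ W_out

def sparse_ffn(x, keep_ratio=0.25):
    # A real system uses a small learned predictor here; this sketch just
    # scores neurons by their pre-activation to pick the likely-active ones.
    scores = x @ W_in
    k = int(d_ffn * keep_ratio)
    active = np.argsort(scores)[-k:]            # indices of likely-active neurons
    hidden = np.maximum(x @ W_in[:, active], 0.0)
    return hidden @ W_out[active, :]            # only k rows/columns are used

x = rng.standard_normal(d_model)
full, approx = dense_ffn(x), sparse_ffn(x)
print("relative error:", np.linalg.norm(full - approx) / np.linalg.norm(full))
```

Because ReLU-style activations zero out most neurons anyway, skipping the inactive ones can cut memory traffic and compute substantially, which is the property a heterogeneous CPU/NPU scheduler can then exploit to keep the heavy sparse matrix work on the accelerator.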
The device supports multi-step reasoning, deep context understanding, agent workflows, content generation, and secure processing of sensitive information, even without internet access.[3] It also provides true long-term personal memory by storing user data, preferences, and documents locally with bank-level encryption, offering persistence that cloud-based AI systems cannot match.
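Tiiny AI has not said which encryption scheme backs this local memory, so the snippet below is only a generic illustration of the pattern being described: notes are encrypted before they touch local storage and decrypted on the same machine, with nothing sent off-device. The `cryptography` library, file layout, and key handling are assumptions for the example, not the Pocket Lab's implementation.

```python
# Generic encrypted-at-rest local storage sketch (illustrative assumptions only).
from pathlib import Path
from cryptography.fernet import Fernet  # third-party: pip install cryptography

store = Path("personal_memory")
store.mkdir(exist_ok=True)

# A real device would keep the key in a hardware-backed keystore,
# not in a plain file next to the data.
key_path = store / "key.bin"
if not key_path.exists():
    key_path.write_bytes(Fernet.generate_key())
cipher = Fernet(key_path.read_bytes())

def remember(name: str, text: str) -> None:
    """Encrypt a note and keep it on local disk only."""
    (store / f"{name}.enc").write_bytes(cipher.encrypt(text.encode()))

def recall(name: str) -> str:
    """Decrypt a previously stored note on the same device."""
    return cipher.decrypt((store / f"{name}.enc").read_bytes()).decode()

remember("preferences", "prefers concise answers; works offline on weekends")
print(recall("preferences"))
```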
Tiiny AI plans to showcase the Pocket Lab at CES 2026, though pricing and release details remain undisclosed.[1] The device targets developers, researchers, creators, professionals, and students who need powerful AI capabilities without cloud infrastructure dependence.[3]
Industry observers will watch closely to see whether the Pocket Lab can deliver on its promises when it reaches real users, particularly regarding thermal management and sustained performance in such a compact form factor. The success of this personal AI supercomputer could signal a broader shift toward decentralized AI computing, where individuals gain direct control over their intelligence tools rather than relying on data center infrastructure.