2 Sources
[1]
Figure robot gets AI brain that enables human-like full-body control
Helix, the AI brain powering Figure's humanoid robots, has been upgraded to its most advanced full-body control system to date. Unlike earlier models limited to upper-body tasks, Helix 02 uses a single neural network to control walking, manipulation, and balance together, directly from raw sensor data. In a key demonstration, the humanoid autonomously unloaded and reloaded a dishwasher across an entire kitchen without resets or human input. According to Figure, the system replaces complex hand-coded control with learned, human-like motion, opening new levels of dexterity. In February 2025, the California-based firm unveiled Helix, a generalist Vision-Language-Action (VLA) model that combines perception, language, and control to advance robotics.

For decades, loco-manipulation -- the ability of a robot to move and manipulate objects as a single continuous behavior -- has remained one of robotics' most difficult challenges. Walking and manipulation work well in isolation, but combining them introduces constant coupling: lifting affects balance, stepping changes reach, and arms and legs continuously constrain one another. While humanoid robots have shown impressive short, scripted feats such as dancing or jumping, most lack true adaptability: their motions are often planned offline and break down when real-world conditions change. Traditional robotics has addressed this by separating locomotion and manipulation into distinct controllers linked by state machines, resulting in slow, brittle, and unnatural behavior. True autonomy demands a fundamentally different approach -- a unified system that perceives, decides, and acts with the entire body at once. This is the motivation behind Helix 02, a unified whole-body loco-manipulation VLA system. Helix 02 introduces System 0, a new foundation layer added to Figure's existing System 1 and System 2 hierarchy.
System 2 handles high-level reasoning and language, System 1 translates perception into full-body motion at high frequency, and System 0 executes human-like balance and coordination at kilohertz rates. According to Figure, System 0 is trained on over 1,000 hours of human motion data and large-scale simulation, replacing hand-engineered control with a learned prior for stable, natural movement. Together, the three systems enable continuous, adaptive whole-body autonomy -- allowing humanoid robots to walk, carry, reach, and recover in real time.

Figure claims Helix 02 demonstrates a major step forward in humanoid autonomy by performing continuous, multi-minute tasks that require tight integration of locomotion, dexterity, and sensing. In fully autonomous evaluations, the system completes extended locomotion and manipulation behaviors without teleoperation or resets. A flagship demonstration shows Helix 02 loading and unloading a dishwasher across a full-sized kitchen during a four-minute, end-to-end task -- the most complex autonomous manipulation sequence shown to date and the first long-horizon "pixels-to-whole-body" control on a humanoid robot. The task highlights several capabilities: walking while maintaining delicate grasps, using the entire body to interact with the environment, and coordinating both arms throughout complex object transfers and placement. The same neural network controls motions ranging from millimeter-scale finger movements to room-scale locomotion, sequencing more than 60 actions with implicit error recovery over minutes of execution.

Helix 02 also advances dexterous manipulation through tactile sensing and palm-mounted cameras. In autonomous tests, the robot unscrews bottle caps, extracts individual pills from organizers despite occlusion, dispenses precise syringe volumes under variable resistance, and selects small metal parts from cluttered containers.
According to the robotics firm, these results, taken together, show how Helix 02 combines full-body control, touch, and in-hand vision to achieve continuous, adaptive autonomy across complex, real-world tasks. "The results are early - but they already show what continuous, whole-body autonomy makes possible. A 4-minute autonomous task with 61 fluidly executed loco-manipulation actions, dexterous behaviors enabled by tactile sensing and palm cameras, and whole-body coordination that uses hips and feet alongside hands and arms," Figure said in a statement.
[2]
Figure AI unveils Helix 02 with full-body robot autonomy By Investing.com
Investing.com -- Figure AI has introduced Helix 02, a new humanoid robot model that controls the entire robot body directly from visual input, enabling seamless integration of walking, manipulation, and balance. The company describes Helix 02 as its most capable humanoid model to date, featuring a single neural system that powers "dexterous, long horizon autonomy" throughout an entire room. A key demonstration shows the robot autonomously unloading and reloading a dishwasher across a full-sized kitchen - a four-minute task completed without human intervention. Figure AI claims this represents "the longest horizon, most complex task completed autonomously by a humanoid robot to date."

Helix 02 connects all onboard sensors - including vision, touch, and proprioception - directly to every actuator through a unified neural network. The system is powered by "System 0," a learned whole-body controller trained on over 1,000 hours of human motion data and simulation-to-real reinforcement learning. The company states that System 0 replaces 109,504 lines of hand-engineered C++ code with a single neural system for stable, natural motion. With Figure 03's embedded tactile sensing and palm cameras, Helix 02 can perform previously challenging manipulations such as extracting individual pills, dispensing precise syringe volumes, and handling small, irregular objects from cluttered environments despite self-occlusion.

Figure AI explains that loco-manipulation - the ability of a robot to move and manipulate objects as a single, continuous behavior - has been one of robotics' most difficult unsolved problems because the actions constrain each other continuously. Traditional robotics has worked around this by separating locomotion and manipulation into distinct controllers connected with state machines, but Figure AI argues that true autonomy requires a single learning system that reasons over the whole body simultaneously.
Helix 02 extends the company's "System 1, System 2" architecture with the new System 0 foundation layer. Each system operates at a different timescale: System 2 reasons about goals, System 1 translates perception into joint targets at 200 Hz, and System 0 executes at 1 kHz to handle balance, contact, and coordination. The company demonstrated Helix 02 performing various tasks, including unscrewing bottle caps, locating and extracting pills from medicine boxes, dispensing precise volumes from syringes, and picking metal pieces from cluttered boxes.
Figure AI has introduced Helix 02, its most advanced humanoid robot model featuring human-like full-body control through a single neural network. The system autonomously completed a four-minute dishwasher task across a full kitchen—the longest and most complex autonomous humanoid demonstration to date. Trained on over 1,000 hours of human motion data, Helix 02 integrates walking, manipulation, and balance without human intervention.
Figure AI has unveiled Helix 02, an advanced humanoid robot model that achieves human-like full-body control through a unified approach to robotics [1]. Unlike earlier models limited to upper-body tasks, Helix 02 uses a single neural network to control walking, manipulation, and balance together, directly from raw sensor data [1]. The California-based firm first introduced Helix in February 2025 as a generalist Vision-Language-Action model, but this latest iteration marks a significant leap in achieving continuous, adaptive whole-body autonomy [1].

In a flagship demonstration, the humanoid robot was able to autonomously unload and reload a dishwasher across an entire kitchen without resets or human input [1]. This four-minute, end-to-end task represents the longest-horizon and most complex task completed autonomously by a humanoid robot to date, according to Figure AI [2]. The same neural network controlled motions ranging from millimeter-scale finger movements to room-scale locomotion, sequencing more than 60 actions with implicit error recovery over minutes of execution [1]. The task highlighted several capabilities, including walking while maintaining delicate grasps, using the entire body to interact with the environment, and coordinating both arms throughout complex object transfers and placement [1].
Source: Interesting Engineering
For decades, loco-manipulation—the ability of a robot to move and manipulate objects as a single continuous behavior—has remained one of robotics' most difficult challenges [1]. Traditional robotics has addressed this by separating locomotion and manipulation into distinct controllers linked by state machines, resulting in slow, brittle, and unnatural behavior [1]. Helix 02 introduces System 0, a new foundation layer that serves as a learned whole-body controller trained on over 1,000 hours of human motion data and large-scale simulation using reinforcement learning [1][2]. System 0 replaces 109,504 lines of hand-engineered C++ code with a single neural system for stable, natural motion [2].

Helix 02 extends Figure AI's existing architecture with three distinct systems operating at different timescales [2]. System 2 handles high-level reasoning and language about goals, System 1 translates perception into full-body motion at 200 Hz, and System 0 executes human-like balance and coordination at 1 kHz to handle balance, contact, and whole-body coordination [1][2]. Together, the three systems enable continuous, adaptive whole-body autonomy that allows humanoid robots to walk, carry, reach, and recover in real time [1].
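The timescale hierarchy described above can be pictured as nested control loops: each slow System 1 decision is refined by several fast System 0 ticks before the next decision arrives. The sketch below is illustrative only; the 200 Hz and 1 kHz rates come from the articles, but all function and variable names are hypothetical and do not reflect Figure AI's software.

```python
# Illustrative sketch of a three-tier control hierarchy (hypothetical
# names; only the 200 Hz / 1 kHz rates come from the article).

SYSTEM1_HZ = 200    # System 1: perception -> full-body joint targets
SYSTEM0_HZ = 1000   # System 0: learned balance/contact controller

def run_hierarchy(system0_ticks):
    """Nest the loops: each System 1 tick is refined by
    SYSTEM0_HZ // SYSTEM1_HZ (= 5) System 0 ticks."""
    ratio = SYSTEM0_HZ // SYSTEM1_HZ
    goal = "load_dishwasher"               # chosen by System 2 (reasoning)
    trace = []
    for outer in range(system0_ticks // ratio):   # System 1 loop
        targets = (goal, outer)                   # fresh joint targets
        for inner in range(ratio):                # System 0 loop
            # convert targets to joint commands while keeping balance
            trace.append((outer, inner))
    return trace

trace = run_hierarchy(20)   # 4 System 1 ticks x 5 System 0 ticks each
```

The point of the nesting is that balance and contact handling never wait on perception: System 0 keeps stabilizing at its own rate between updates from the slower layers.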
Helix 02 also advances dexterous manipulation through tactile sensing and palm cameras integrated into Figure 03's hardware [1][2]. The system connects all onboard sensors—including vision, touch, and proprioception—directly to every actuator through a unified neural network [2]. In autonomous tests, the robot demonstrated the ability to unscrew bottle caps, extract individual pills from organizers despite occlusion, dispense precise syringe volumes under variable resistance, and select small metal parts from cluttered containers [1][2]. These capabilities showcase how integrated walking, manipulation, and balance work together to achieve continuous autonomy across complex, real-world tasks [1].
Figure AI's approach represents a shift from scripted demonstrations to true adaptability in humanoid robotics [1]. While many humanoid robots have shown impressive short, scripted feats such as dancing or jumping, most lack true adaptability, with motions planned offline that break down when real-world conditions change [1]. By replacing complex hand-coded control with learned, human-like motion, Helix 02 opens new levels of dexterity and practical application [1]. "The results are early - but they already show what continuous, whole-body autonomy makes possible. A 4-minute autonomous task with 61 fluidly executed loco-manipulation actions, dexterous behaviors enabled by tactile sensing and palm cameras, and whole-body coordination that uses hips and feet alongside hands and arms," Figure AI stated [1]. This development matters because it demonstrates how AI and robotics can converge to create systems capable of performing extended, multi-minute tasks requiring tight integration of locomotion, dexterity, and sensing—capabilities essential for deployment in homes, healthcare facilities, and commercial environments where error recovery and adaptability are critical.