What Actually Happened
The livestream was triggered by a public challenge. Scott Walter, a well-known robotics commentator, had argued that humanoid robots hold limited commercial value unless they can independently complete a full work shift — defined as eight hours — without human assistance. Figure CEO Brett Adcock responded by simply going live on X and letting the robots prove it.
Three Figure F.03 robots, powered by the company's Helix-02 AI system, worked in relay rotation on a package-sorting line. Their task: detect each package's barcode with onboard cameras, pick the package up, orient it correctly, and place it on the conveyor barcode face-down. They sustained roughly 3 seconds per package, the same throughput as a trained human worker.
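The reported numbers make for a quick back-of-envelope check. A minimal sketch, assuming only the figures above (3 seconds per package, an 8-hour shift); the swap-downtime figures are illustrative assumptions, not Figure numbers:

```python
# Back-of-envelope throughput for the sort line described above.
SHIFT_SECONDS = 8 * 60 * 60        # 8-hour shift
SECONDS_PER_PACKAGE = 3            # reported cycle time

packages_ideal = SHIFT_SECONDS // SECONDS_PER_PACKAGE
print(packages_ideal)              # 9600 packages with zero downtime

# Hypothetical: each battery handoff costs 60 seconds of line downtime,
# and handoffs happen roughly every 2 hours (3 swaps in 8 hours).
swaps = SHIFT_SECONDS // (2 * 60 * 60) - 1
packages_effective = (SHIFT_SECONDS - swaps * 60) // SECONDS_PER_PACKAGE
print(packages_effective)          # 9540 — relay overhead barely dents output
```

Even with generous assumptions about handoff cost, the relay design keeps the line within about 1% of ideal throughput, which is presumably the point of swapping robots rather than pausing to charge.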
The robots also handled battery depletion autonomously. When a unit's battery ran low, it signalled for a replacement and left the line, and the next robot took over, without any human coordinating the handoff.
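The handoff described above amounts to a small coordination protocol. Here is an illustrative sketch of that logic; the state representation, the 15% threshold, and the queue mechanism are all assumptions, since Figure has not published Helix-02's actual coordination scheme:

```python
# Hypothetical relay-handoff logic: a low-battery robot signals for a
# replacement, leaves the line, and the next charged unit steps in.
from collections import deque
from dataclasses import dataclass

LOW_BATTERY = 0.15  # assumed handoff threshold, not a published figure

@dataclass
class Robot:
    name: str
    charge: float  # 0.0 - 1.0

def run_line(active: Robot, standby: deque) -> Robot:
    """Return the robot that should be on the line this tick."""
    if active.charge <= LOW_BATTERY and standby:
        # Active unit leaves the line; the next charged robot takes over
        # with no human in the loop.
        replacement = standby.popleft()
        standby.append(active)      # spent unit queues up to recharge
        return replacement
    return active

line = Robot("F03-A", 0.12)
reserve = deque([Robot("F03-B", 1.0), Robot("F03-C", 1.0)])
on_line = run_line(line, reserve)
print(on_line.name)   # F03-B takes over; F03-A goes to the back of the queue
```

The non-trivial part in a real deployment is not this decision rule but doing it while three agents share one physical line; the sketch only captures the scheduling shape of the problem.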
Helix-02 is Figure AI's unified neural network that combines vision, touch, proprioception, and whole-body control into a single learning system. Unlike traditional factory robots that use separate controllers for movement and manipulation, Helix-02 processes all sensory input through one model — allowing the robot to walk, balance, pick up objects, and respond to unexpected changes in its environment without mode-switching or manual intervention. It replaced over 109,000 lines of hand-engineered C++ code with a single learned control system trained on more than 1,000 hours of human motion data.
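The "one model instead of separate controllers" idea can be sketched in miniature: every sensory stream is embedded, the embeddings are fused into a single state vector, and one learned mapping produces whole-body motor commands. The dimensions, the random stand-in weights, and the linear "policy" below are placeholders; Figure has not published Helix-02's internals:

```python
# Conceptual sketch of a unified sensorimotor pipeline: vision, touch,
# and proprioception feed one fused state, which one head maps to motors.
import random

random.seed(0)

def embed(signal: list, dim: int = 8) -> list:
    """Stand-in for a learned encoder: project a signal to `dim` features."""
    weights = [[random.uniform(-1, 1) for _ in range(dim)] for _ in signal]
    return [sum(x * w[j] for x, w in zip(signal, weights)) for j in range(dim)]

vision = [random.uniform(-1, 1) for _ in range(64)]   # camera features
touch = [random.uniform(-1, 1) for _ in range(16)]    # fingertip forces
proprio = [random.uniform(-1, 1) for _ in range(24)]  # joint angles/velocities

# One fused representation instead of separate movement and manipulation
# controllers — the architectural shift the paragraph above describes:
state = embed(vision) + embed(touch) + embed(proprio)

NUM_MOTORS = 40  # arbitrary motor-channel count for illustration
policy = [[random.uniform(-1, 1) for _ in range(NUM_MOTORS)] for _ in state]
action = [sum(s * row[j] for s, row in zip(state, policy))
          for j in range(NUM_MOTORS)]
print(len(action))   # every motor command comes from the same fused state
```

The practical consequence of this structure is the one the article notes: there is no mode-switching boundary to hand-engineer, because walking, balancing, and grasping all read from the same state vector.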
What the Robots Can Now Do
The 8-hour package sort is the most public demonstration, but it sits on top of a significant capability stack that Figure AI has been building throughout 2026. The full picture of what Helix-02-equipped F.03 robots can currently do:
- Package sorting at human speed — barcode detection, correct orientation, conveyor placement, 3 seconds per item
- Bedroom reset in under two minutes — two robots coordinating to hang clothes, make a bed, take out trash, reposition furniture, and close books without a central controller
- Fine motor tasks — unscrewing bottle caps, extracting pills from organisers, pushing precise syringe volumes, picking metal parts from cluttered bins using tactile fingertip sensing
- Stair and ramp navigation — Helix-02's System 0 (S0) controller now sees the environment through onboard cameras, allowing the robot to anticipate terrain changes rather than relying on hand-tuned mode switches
- Autonomous battery management — robots self-identify when to leave the line and signal for replacement without human coordination
Who Is Figure AI and Why Does This Matter
The company
Figure AI was founded in 2022 by Brett Adcock — also the founder of Archer Aviation and talent marketplace Vettery. The company has raised $1.9 billion in total funding, reaching a $39 billion valuation at its September 2025 Series C led by Parkway Venture Capital. Backers include NVIDIA, Microsoft, Intel Capital, Qualcomm Ventures, Salesforce, and LG Technology Ventures. It is currently private, with a potential IPO window speculated at 2027–2028.
The robot
The Figure F.03 stands 5 feet 8 inches tall, weighs 61 kilograms, carries 20-kilogram payloads, and walks at 1.2 metres per second. It has 16 degrees of freedom in its hands, tactile sensors in its fingertips capable of detecting forces as small as 3 grams, and cameras in each hand in addition to the head-mounted stereo vision system. It runs on a 2.3 kWh swappable battery that charges wirelessly via inductive floor pads.
Real commercial deployment — not just demos
Before the viral livestream, Figure had already delivered robots to BMW's Spartanburg, South Carolina plant — where Figure 02 ran daily 10-hour shifts for 11 months, loading over 90,000 parts and contributing to more than 30,000 vehicles. Those 1,250+ runtime hours of real factory data directly informed the Figure 03 redesign. The company is currently ramping F.03 production at its BotQ facility toward 12,000 units per year, with a longer-term target of 100,000 annually. Business model: Robot-as-a-Service, approximately $1,000 per robot per month.
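The reported RaaS price implies striking unit economics. A rough sketch using only the figures above ($1,000 per robot per month, 10-hour shifts); the working-days count is an illustrative assumption:

```python
# Cost-per-runtime-hour under the reported Robot-as-a-Service pricing.
MONTHLY_FEE = 1_000          # USD per robot per month (reported)
HOURS_PER_DAY = 10           # shift length reported at BMW Spartanburg
DAYS_PER_MONTH = 26          # assumed working days, not a Figure number

runtime_hours = HOURS_PER_DAY * DAYS_PER_MONTH
cost_per_hour = round(MONTHLY_FEE / runtime_hours, 2)
print(cost_per_hour)         # under 4 USD per hour of labour
```

At that utilisation the robot undercuts any human wage by a wide margin, which is presumably why the subscription model, rather than the hardware itself, is the commercial story.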
Not everyone is convinced the livestream tells the whole story. Some observers noted that one of the F.03 robots appeared to touch its head at one point during the shift, a gesture some associate with operators adjusting a VR headset during teleoperation. Figure has not published detailed third-party verification of the full session. Separately, in November 2025, the company's former head of product safety filed a lawsuit alleging the safety roadmap was compromised to close the funding round, a claim Figure disputes. The 10 million views drew as much anxiety as amazement in the comment section. One Reddit user summed it up: the robots are "stealing jobs from warehouse workers AND streamers."
What Comes Next
During the livestream, CEO Brett Adcock announced the next challenge: a 24-hour continuous autonomous shift. If the 8-hour run drew 10 million views, the 24-hour attempt will be one of the most-watched robotic events in history.
Beyond the stunts, Figure's roadmap is focused on home deployment. Alpha testing of F.03 in real residential homes is underway, with broader home availability targeting late 2026. The goal Adcock has articulated publicly: robots performing household tasks in homes they have never visited before, without prior mapping or setup. That is a substantially harder problem than a controlled factory line — and it is where the real commercial scale lives.
Figure 04 was also teased during the stream, with Adcock describing it as incorporating "the biggest single upgrade iteration yet." No specifications were shared.
As someone who has spent 13 years building XR and AI systems for industrial environments — including VR safety training deployed at GITEX and ADIPEC — I have watched humanoid robot demos with healthy scepticism for years. The Figure livestream is different in one specific way: it was not a controlled two-minute clip. Eight hours is a real shift. The relay battery management alone — robots autonomously deciding when to leave and re-enter the line — is a non-trivial systems problem that most demos conveniently skip. The Helix-02 architecture replacing 109,000 lines of hand-engineered C++ with a single learned controller is the detail that matters most technically. That is the same trajectory we saw in software AI — at some point the learned system just outperforms the hand-coded one. For factories and warehouses, that inflection point appears to have arrived. The home environment is the genuinely hard problem remaining, and I expect Figure 04 is being built specifically to crack it.