Humanoid Robotics Breakthroughs at AWE 2026 in Shanghai - Steves AI Lab


AWE 2026 in Shanghai, China, one of the biggest consumer electronics and AI trade shows, recently concluded, showcasing some of the most advanced humanoid robots ever demonstrated. Unlike earlier robotic exhibits that focused on entertainment or scripted movements, this year's highlight was far more meaningful: robots performing highly delicate real-world industrial tasks. Engineers have struggled for decades to enable machines to handle deformable materials such as fabrics, threads, and thin wires. At AWE 2026, that challenge saw a breakthrough.

The Challenge of Deformable Objects

One of the hardest problems in robotics is working with deformable objects—materials that do not maintain a fixed shape. Unlike rigid objects, fabrics, threads, and wires constantly change form depending on pressure, tension, and movement. This makes them extremely difficult for robots to predict and manipulate.

Tasks like embroidery or automotive wire harness assembly require extreme precision, fine motor control, and real-time adjustment. Even the slightest miscalculation can ruin the entire process. Until now, these jobs were considered too complex for full automation, forcing industries to rely heavily on human labor.

TARS and the Rise of Embodied AI Models

A major highlight of the expo came from a relatively new company called TARS, founded in February 2025. Despite being a young startup, TARS made a strong impression with its advanced humanoid robotics system and even attracted national media attention.

The company introduced a new robotics philosophy, which it describes as "super algorithms + super embodiment + super applications." This approach combines AI reasoning, physical interaction capability, and real-world use cases into a unified system.

SenseHub and the Power of Real-World Data

One of the most innovative contributions from TARS is a data collection system called SenseHub. It captures human actions in real time using wearable gloves and multi-sensor cameras that record touch, movement, pressure, and spatial orientation.

Operators do not need to code or program anything; they simply perform natural tasks. The system records everything: finger movements, grip strength, joint angles, and even subtle adjustments made during work. This dataset, called WIH (World in Your Hands), is one of the first large-scale multimodal robotics datasets combining vision, language, touch, and motion.
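To make the idea concrete, the kind of multimodal record described above can be sketched as a simple data structure. This is a minimal illustration only, not TARS's actual format: the class and field names (GloveFrame, MultimodalSample, and so on) are assumptions for the sake of the example.

```python
from dataclasses import dataclass, field

@dataclass
class GloveFrame:
    """One time-step captured from a wearable glove (illustrative fields)."""
    timestamp_ms: int
    finger_joint_angles_deg: list[float]   # one angle per tracked finger joint
    grip_force_n: float                    # fingertip pressure, in newtons
    wrist_pose: tuple[float, ...]          # position x, y, z + orientation quaternion

@dataclass
class MultimodalSample:
    """One demonstration combining motion, vision, and a language label."""
    frames: list[GloveFrame] = field(default_factory=list)
    camera_frames: list[bytes] = field(default_factory=list)  # raw image payloads
    task_description: str = ""             # natural-language annotation

    def duration_ms(self) -> int:
        """Span of the recorded demonstration in milliseconds."""
        if not self.frames:
            return 0
        return self.frames[-1].timestamp_ms - self.frames[0].timestamp_ms

# Usage: append frames as the operator performs the task, then store the sample.
sample = MultimodalSample(task_description="thread wire through connector")
sample.frames.append(GloveFrame(0, [10.0] * 15, 2.5, (0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0)))
sample.frames.append(GloveFrame(40, [12.0] * 15, 2.7, (0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0)))
print(sample.duration_ms())  # 40
```

The point of such a record is that the operator only performs the task; the glove and camera streams are logged automatically, and the paired language label is what lets the dataset combine vision, touch, motion, and text.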

This approach allows robots to learn directly from human experience rather than synthetic simulations, significantly improving adaptability and precision.

Performance Breakthroughs and Real-World Impact

The results of this new training method are impressive. Robots trained using this system show a threefold improvement in handling unfamiliar environments and a 45% reduction in motion errors.

In a landmark achievement, the TARS robot set a Guinness World Record by performing 105 ultra-precise wire harness assembly operations in one hour, handling wires thinner than one millimeter. These tasks are physically demanding even for humans and often result in high workplace turnover due to strain and fatigue.

Future Applications and Robotics Evolution

TARS has developed two robot series: the A-series for stable indoor industrial environments and the T-series for complex terrains like stairs and uneven surfaces. Both are powered by the same advanced AI system, making them highly versatile across industries such as electronics manufacturing and automotive assembly.
