New Framework From CMU, NVIDIA Enables Stiff Humanoids To Move With Agility
After scoring a goal, soccer superstar Cristiano Ronaldo will run to the sideline, leap into the air, spin 180 degrees, land with arms outstretched and shout “Siu.” When LeBron James hits a clutch shot, he celebrates with the “Silencer,” rhythmically stomping his feet while forcefully pushing down with his arms to quiet any question of his greatness. Also on the court, Kobe Bryant’s signature fadeaway jumper is a masterclass in flair, precision and footwork, a nearly impossible-to-block shot with devastating accuracy.
These moves, recognizable throughout professional sports and permeating popular culture, inspired groundbreaking research from Carnegie Mellon University’s Robotics Institute and NVIDIA to push the boundaries of how humanoid robots move.
“Growing up watching athletes like Kobe Bryant, Cristiano Ronaldo and LeBron James, I was always inspired by the grace and power of human movement,” said Tairan He, a Ph.D. student in the RI who worked on a new framework enabling stiff robots to move with fluidity and agility comparable to star athletes. “Seeing humanoid robots now replicate those same signature moves — moves that once seemed uniquely human — is both surreal and deeply rewarding.”
For years, humanoid robots have captured the attention of researchers and the imagination of the public. Yet capturing the advanced coordination of humans has always been a challenge. In an exciting step forward, the Learning and Control for Agile Robotics (LeCAR) Lab at the RI worked with researchers from the NVIDIA GEAR robotics research lab to develop ASAP: Aligning Simulation and Real-World Physics for Learning Agile Humanoid Whole-Body Skills.
“The ASAP framework enables fast and reliable deployment of humanoid robots in a variety of tasks, which has great potential to revolutionize the role humanoid robots play in the human world,” said Changliu Liu, assistant professor in the RI.
More agile robots could navigate a person’s home and complete household chores more effectively. They could interact with patients or residents in care homes in softer and gentler fashions. The team is also working with CMU’s Entertainment Technology Center to explore ways to use humanoid robots as characters in amusement parks and determine whether robots could also compete in sports like boxing, football or soccer.
While recent studies have introduced human-like movement in humanoid robots, those efforts have focused on upper-body motions, missing the diversity of full-body human movement. ASAP addresses these challenges and overcomes obstacles such as hardware limits and the mismatch between simulated dynamics and real-world physics.
ASAP is a two-stage framework that enables agile humanoid whole-body skills. In the pretraining stage, the team trained a motion tracking policy by using videos of human movement as a data source. The human motions were retargeted to humanoid robots and the motion tracking policy was trained to follow them. But when the team deployed the policy on humanoid robots, the real-world physics did not match the simulation perfectly, resulting in lower performance from the robot.
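The pretraining idea described above, rewarding the robot's policy for staying close to a retargeted human reference motion, can be illustrated with a minimal sketch. The function name, the exponentiated-error form of the reward and the toy joint values below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def tracking_reward(robot_pose, reference_pose, sigma=0.5):
    # Exponentiated negative tracking error: 1.0 when the robot
    # matches the reference exactly, decaying toward 0 as it drifts.
    error = np.sum((robot_pose - reference_pose) ** 2)
    return float(np.exp(-error / sigma))

# Toy example: three joint angles (radians) tracking a retargeted
# human keyframe. Values are made up for illustration.
reference = np.array([0.1, -0.4, 0.8])
on_track = reference.copy()
drifted = reference + np.array([0.2, 0.0, -0.1])

print(tracking_reward(on_track, reference))  # 1.0 (perfect tracking)
print(tracking_reward(drifted, reference))   # less than 1.0
```

A reinforcement learning algorithm would maximize this reward over whole motion sequences, so the policy learns to reproduce the human movement frame by frame.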
The researchers addressed this in the post-training stage. In this stage, they collected data from their humanoid robots as they moved, including internal sensor readings and positions tracked by a motion capture system. The newly collected data was replayed in simulation to identify where real-world behavior differed from what the simulation predicted. Once the differences were identified, the team could train a model to compensate for them, bringing the simulated and real-world behaviors closer together.
“It’s nontrivial to leverage real-world data effectively,” said Wenli Xiao, a robotics Ph.D. student and co-lead researcher on ASAP. “Given its noisy nature and limited availability, we needed a smarter way to extract information from it. We explored approaches like system identification and learning a world model, but ultimately developed deltaA, a data-efficient method that bridges simulation and reality through reinforcement learning, significantly enhancing performance.”
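The residual-correction idea behind a delta action model can be sketched as follows: a learned correction term is added to the policy's action before it is applied in simulation, nudging simulated dynamics toward what the real robot actually did. Everything here, the linear stand-ins for the policy and the delta model, the dimensions and the 0.05 scale, is a simplifying assumption for illustration, not ASAP's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in weights for a pretrained policy and a learned delta model.
W_policy = rng.normal(size=(4, 2))
W_delta = rng.normal(size=(6, 2))

def policy_action(state):
    # Pretrained motion-tracking policy (illustrative linear map).
    return np.tanh(state @ W_policy)

def delta_action(state, action):
    # Learned residual compensating for the sim-to-real dynamics gap.
    # In ASAP this is trained with reinforcement learning on replayed
    # real-world data; here it is just a small illustrative correction.
    return 0.05 * np.tanh(np.concatenate([state, action]) @ W_delta)

state = rng.normal(size=4)
a = policy_action(state)
a_corrected = a + delta_action(state, a)  # action applied in simulation
print(a_corrected)
```

Because the correction is learned from the robot's own recorded trajectories, the simulator effectively becomes a closer match to the real machine, and the policy can then be fine-tuned against it.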
Using a humanoid robot, the team conducted several experiments testing the agility and fluidity of motions, including jumping, kicking and squatting, similar to those athletes perform in training and games.
“It feels like we’re bridging science fiction and reality, one step at a time,” Tairan said.
From the experiments, the researchers found that ASAP demonstrated significant reductions in motion tracking errors and successfully deployed diverse, agile skills on the humanoid robot.
“The ASAP framework, through a ‘real2sim2real’ approach, combines the power of NVIDIA’s GPU-accelerated physics engines and the uncanny ability of modern neural nets to capture the complex dynamics of humanoid robots,” said Jim Fan, director of AI and distinguished scientist at NVIDIA.
ASAP was accepted to the 2025 Robotics: Science and Systems conference, where the team will present their research. The group is excited to share their work with the broader robotics community in hopes that the ASAP framework’s unique whole-body agility and real-world deployment inspire future breakthroughs in sim-to-real learning.
“The ASAP framework can be widely deployed in real-world humanoid tasks requiring whole-body agility, such as household chores, delivery and industrial inspection,” said Guanya Shi, an assistant professor in the RI. “The key idea of learning to bridge the sim-to-real gap can also be deployed in many other robotic problems such as dexterous manipulation.”
The ASAP team includes Carnegie Mellon students Tairan He, Jiawei Gao, Wenli Xiao, Yuanhang Zhang, Zi Wang, Jiashun Wang, Zhengyi Luo, Guanqi He, Nikhil Sobanbab, Chaoyi Pan and Zeji Yi. They were assisted in the project by RI faculty Guanya Shi, Kris Kitani, Jessica Hodgins and Changliu Liu; Electrical and Computer Engineering faculty Guannan Qu; and NVIDIA research scientists Linxi “Jim” Fan and Yuke Zhu.
For More Information: Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu