The Breakdown:
- Super Odometry helps robots keep their sense of direction when smoke, darkness or dust block their cameras.
- The system allows robots to rely on internal motion sensors when they can’t see clearly.
- In real-world tests, it kept robots moving safely through tough conditions like stairs, low light and smoke.
***
When a drone enters a burning building to assist with search and rescue, visibility can vanish in seconds. Camera feeds dissolve into gray. Laser pulses scatter unpredictably off airborne ash. In the very environments where robots are needed most, the sensors they rely on can quickly become unreliable.
Carnegie Mellon University researchers in the Robotics Institute's (RI) AirLab developed Super Odometry to address this challenge. Odometry refers to tracking a robot's motion over time to determine how its position changes. Super Odometry combines data from multiple sensors to estimate a robot's position even when some of those sensors are degraded or failing, keeping robots aware of their locations and working safely when cameras and lidar start to break down.
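To make the odometry idea concrete, here is a minimal dead-reckoning sketch in Python: it integrates assumed velocity and turn-rate readings into a planar position estimate. The function name, sensor values and update rate are illustrative, not taken from the CMU system.

```python
import math

def integrate_odometry(pose, v, omega, dt):
    """One dead-reckoning step: advance a planar pose (x, y, heading)
    using a forward velocity v (m/s) and turn rate omega (rad/s).
    Illustrative only; real systems fuse many sensors and model noise."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Example: drive straight for a second, then arc. Small per-step errors
# accumulate, which is why odometry alone drifts over long missions.
pose = (0.0, 0.0, 0.0)
for _ in range(100):                      # 1 s of straight motion at 100 Hz
    pose = integrate_odometry(pose, v=1.0, omega=0.0, dt=0.01)
for _ in range(100):                      # 1 s of gentle turning
    pose = integrate_odometry(pose, v=1.0, omega=0.5, dt=0.01)
print(pose)
```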

“In simple terms, Super Odometry helps robots keep track of where they are and where they’re going even when their usual sensors fail,” said Shibo Zhao, an RI Ph.D. student and first author of the paper introducing the system. “It allows robots to stay in motion, make decisions and complete missions in scenarios that would normally stop them in their tracks.”
The AirLab is exploring applications for Super Odometry where resilient navigation could have an immediate impact, including wildfire monitoring, search and rescue, emergency response and firefighting support, autonomous driving, and large-scale mapping.
For decades, robotic navigation has depended heavily on external perception sensors. Techniques such as odometry and simultaneous localization and mapping (SLAM) use cameras and light detection and ranging (lidar) to estimate motion and construct maps, allowing machines to understand where they are and how they’re moving. In controlled or clear conditions, this approach works well. But in dark, smoky, dusty or rapidly changing environments, perception can break down and hinder a robot’s ability to orient itself and operate safely.
The research team, advised by RI faculty members Sebastian Scherer and Wenshan Wang, took inspiration from how humans react in low-vision environments. In dense fog or complete darkness, people rely more heavily on internal cues like balance and motion signals from the body to keep track of their position and decide where to move next. The researchers applied a similar idea to robotics by strengthening a robot’s “internal” sense of motion.
“External perception makes systems more accurate, but internal sensing helps systems survive,” Zhao said. “Our goal is to enable safer, more resilient autonomy in the environments that matter most.”
At the center of Super Odometry is a module that uses data from the robot’s inertial measurement unit (IMU), a sensor that measures acceleration and rotation, to estimate movement without reliable visual input. Rather than replacing traditional camera- and lidar-based methods, Super Odometry blends them with this strengthened internal motion estimate and adjusts the balance in real time. When conditions are clear, the system relies more on external sensors. As smoke, dust, darkness or fast movement interferes with visibility, it shifts its reliance toward the internal module to continue tracking motion. This layered design lets the robot respond differently depending on the severity of the disruption.
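One way to picture that blending is a confidence-weighted average between an IMU-based prediction and an external camera or lidar estimate. The sketch below is a simplified illustration under that assumption; the `degradation` score and the linear blend are hypothetical stand-ins for the system's actual fusion machinery.

```python
def fuse_position(imu_estimate, external_estimate, degradation):
    """Blend two position estimates (x, y, z tuples) by how degraded
    the external sensing currently is. degradation = 0.0 means clear
    conditions (trust cameras/lidar); 1.0 means lean entirely on the
    IMU. A hypothetical stand-in for the real fusion algorithm."""
    w_external = 1.0 - degradation
    return tuple(
        w_external * ext + degradation * imu
        for imu, ext in zip(imu_estimate, external_estimate)
    )

# Clear air: the external estimate dominates.
print(fuse_position((1.0, 2.0, 0.5), (1.1, 2.1, 0.5), degradation=0.1))
# Dense smoke: the IMU-based prediction dominates.
print(fuse_position((1.0, 2.0, 0.5), (4.0, 9.0, 0.5), degradation=0.9))
```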
That same hierarchical design also balances efficiency and reliability. Small issues are handled with quick, lightweight adjustments; when conditions become more extreme, the system activates stronger, more computationally demanding tools. This approach lets the robot run efficiently during normal operation while retaining the resilience to handle smoke, darkness or other severe disruptions when they arise.
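The escalation logic can be imagined as a tiered handler that tries cheap corrections first and reserves expensive recovery for severe cases. The thresholds and actions below are assumed for illustration and are not values from the paper.

```python
def respond_to_degradation(severity):
    """Pick a recovery action by severity (0.0 = clear, 1.0 = blinded).
    Thresholds and actions are hypothetical; the point is that cheap
    fixes run first and costly ones are reserved for severe cases."""
    if severity < 0.3:
        return "reweight sensors"         # lightweight: adjust fusion weights
    elif severity < 0.7:
        return "drop degraded features"   # moderate: discard bad measurements
    else:
        return "IMU-only dead reckoning"  # severe: fall back to internal sensing

for s in (0.1, 0.5, 0.9):
    print(f"severity {s:.1f} -> {respond_to_degradation(s)}")
```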
The team evaluated Super Odometry on aerial, wheeled, legged and handheld platforms in real-world environments where cameras and lidar struggle, logging more than 200 kilometers of field tests. In one test on the CMU campus, the system ran a continuous 2,966-meter, 46-minute route that incorporated 13 complex degradation scenarios, including staircase exploration in low light, lens flare, reflective glass corridors and smoke. It completed the entire route without a single failure.
Super Odometry was published in the December 2025 issue of Science Robotics and selected as a featured article. To learn more about the system, visit the project website.
For More Information: Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu