3D Thermal Perception for Autonomous Navigation in Visually Degraded, Unstructured Environments

Robotics Institute, Carnegie Mellon University

Master's Thesis, Tech. Report CMU-RI-TR-25-103, December 2025

Abstract

Autonomous navigation in visually degraded and unstructured environments, such as darkness, smoke, and rough off-road terrain, remains a significant challenge for current robotic systems. RGB cameras fail without illumination, and active sensors such as LiDAR degrade under aerosols and emit signals that are undesirable in sensitive or adversarial scenarios. In contrast, long-wave infrared (thermal) sensing captures naturally emitted radiation and maintains visibility through many atmospheric obscurants. This thesis develops a framework for passive thermal autonomy, enabling 3D stereo perception and autonomous navigation in challenging off-road terrain without active illumination.

The absence of large-scale thermal datasets, sensor-to-perception pipelines, and field deployments has long prevented reliable passive autonomy in low-visibility, off-road environments. This work addresses the complete pipeline: from rigorous sensor integration and custom cross-modality calibration, through large-scale data collection, to stereo thermal mapping and odometry, and finally autonomy deployment. To address the critical gap in thermal 3D vision benchmarks, we contribute two multi-modal datasets: FIReStereo for aerial platforms and TartanDrive 2.5T for ground vehicles, covering diverse off-road terrains and visibility conditions. We introduce MACThermal, which adapts metric- and uncertainty-aware covariance visual odometry to the thermal domain; it employs dynamic-range normalization and geometry-consistent flow augmentation to improve correspondence and uncertainty estimation. We integrate these perception modules into a complete autonomy stack, utilizing visual foundation models for semantic understanding and self-supervised traversability estimation for adaptive decision-making. The full system is validated on a full-scale ATV platform deployed in previously unseen off-road terrain under complete darkness. Together, this work demonstrates a shift from active LiDAR-based navigation to passive thermal autonomy, enabling nighttime traversals in previously inaccessible terrain.
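The thesis does not spell out the exact normalization used by MACThermal, but dynamic-range normalization for raw thermal imagery is commonly done by percentile clipping: raw long-wave infrared frames are 16-bit with most of the range unused, so per-frame rescaling is needed before feature matching. A minimal illustrative sketch, assuming NumPy (function name and percentile thresholds are hypothetical choices, not the thesis's):

```python
import numpy as np

def normalize_thermal(frame: np.ndarray,
                      lo_pct: float = 1.0,
                      hi_pct: float = 99.0) -> np.ndarray:
    """Rescale a raw 16-bit thermal frame to 8-bit via percentile clipping.

    Clipping to per-frame percentiles suppresses hot/cold outliers so the
    remaining dynamic range spreads over the full 8-bit output, which keeps
    image contrast stable across frames for correspondence search.
    """
    lo, hi = np.percentile(frame, [lo_pct, hi_pct])
    if hi <= lo:  # nearly flat frame: avoid divide-by-zero
        return np.zeros_like(frame, dtype=np.uint8)
    scaled = (frame.astype(np.float32) - lo) / (hi - lo)
    return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

# Example on a synthetic 16-bit gradient frame
raw = np.linspace(20000, 30000, 256 * 256).reshape(256, 256).astype(np.uint16)
img8 = normalize_thermal(raw)
```

Per-frame percentiles (rather than a fixed global range) adapt to scene temperature drift between day and night, at the cost of flicker between consecutive frames; temporally smoothed bounds are a common refinement.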

BibTeX

@mastersthesis{Liu-2025-149749,
author = {Yifei Liu},
title = {3D Thermal Perception for Autonomous Navigation in Visually Degraded, Unstructured Environments},
year = {2025},
month = {December},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-25-103},
keywords = {3D Perception, Thermal Sensing, Autonomous Navigation, Off-road Driving, Field Robotics},
}